# Claude-LMStudio Bridge

This is a local-only server: it must run on the same machine as Claude Desktop because it depends on local resources (LM Studio running locally).
An MCP server that bridges Claude with local LLMs running in LM Studio.
## Overview
This tool allows Claude to interact with your local LLMs running in LM Studio, providing:
- Access to list all available models in LM Studio
- The ability to generate text using your local LLMs
- Support for chat completions through your local models
- A health check tool to verify connectivity with LM Studio
## Prerequisites
- Claude Desktop with MCP support
- LM Studio installed and running locally with API server enabled
- Python 3.8+ installed
## Quick Start (Recommended)

### For macOS/Linux
- Clone the repository
- Run the setup script
- Follow the setup script's instructions to configure Claude Desktop
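The steps above can be sketched as shell commands. The repository URL and the setup-script name below are assumptions for illustration; use the actual values from the project page:

```
# Clone the repository (URL is a placeholder)
git clone https://github.com/<user>/claude-lmstudio-bridge.git
cd claude-lmstudio-bridge

# Run the setup script (name assumed) and follow its prompts
./setup.sh
```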
### For Windows
- Clone the repository
- Run the setup script
- Follow the setup script's instructions to configure Claude Desktop
## Manual Setup
If you prefer to set things up manually:
- Create and activate a virtual environment (optional but recommended)
- Install the required packages: `pip install "mcp[cli]" httpx python-dotenv`
- Configure Claude Desktop:
  - Open Claude Desktop preferences
  - Navigate to the 'MCP Servers' section
  - Add a new MCP server with the following configuration:
    - Name: `lmstudio-bridge`
    - Command: `/bin/bash` (on macOS/Linux) or `cmd.exe` (on Windows)
    - Arguments:
      - macOS/Linux: `/path/to/claude-lmstudio-bridge/run_server.sh`
      - Windows: `/c C:\path\to\claude-lmstudio-bridge\run_server.bat`
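If your version of Claude Desktop registers MCP servers through its `claude_desktop_config.json` file rather than a preferences pane, the equivalent entry looks roughly like this sketch (the paths are placeholders):

```json
{
  "mcpServers": {
    "lmstudio-bridge": {
      "command": "/bin/bash",
      "args": ["/path/to/claude-lmstudio-bridge/run_server.sh"]
    }
  }
}
```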
## Usage with Claude

After setting up the bridge, you can ask Claude to use the following tools:

- Check the connection to LM Studio
- List available models
- Generate text with a local model
- Send a chat completion
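Under the hood, chat completions are forwarded to LM Studio's OpenAI-compatible `/v1/chat/completions` endpoint. The sketch below shows the shape of such a request payload; `build_chat_request` is a hypothetical helper for illustration, not the bridge's actual API:

```python
import json

# Sketch of the payload a chat-completion tool would forward to LM Studio's
# OpenAI-compatible /v1/chat/completions endpoint.
def build_chat_request(messages, model=None, temperature=0.7, max_tokens=256):
    payload = {
        "messages": messages,
        "temperature": temperature,
        "max_tokens": max_tokens,
    }
    if model is not None:
        # If omitted, LM Studio falls back to the currently loaded model.
        payload["model"] = model
    return payload

request = build_chat_request([{"role": "user", "content": "Summarize this file."}])
print(json.dumps(request, indent=2))
```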
## Troubleshooting

### Diagnosing LM Studio Connection Issues
Use the included debugging tool to check your LM Studio connection:
For more detailed tests:
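If the bundled debugging tool is unavailable, you can also probe LM Studio's API manually with the Python standard library. This is a minimal sketch: the `LMSTUDIO_HOST`/`LMSTUDIO_PORT` variable names are assumptions, and only the default port 1234 comes from this document:

```python
import json
import os
import urllib.error
import urllib.request

# Env-variable names are assumed; LM Studio's API server defaults to port 1234.
host = os.getenv("LMSTUDIO_HOST", "localhost")
port = os.getenv("LMSTUDIO_PORT", "1234")
url = f"http://{host}:{port}/v1/models"

try:
    with urllib.request.urlopen(url, timeout=5) as resp:
        data = json.load(resp)
    models = [m["id"] for m in data.get("data", [])]
    print("Connected to LM Studio. Loaded models:", models or "(none)")
except (urllib.error.URLError, OSError, ValueError) as exc:
    print(f"Cannot reach LM Studio at {url}: {exc}")
```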
### Common Issues

#### "Cannot connect to LM Studio API"
- Make sure LM Studio is running
- Verify the API server is enabled in LM Studio (Settings > API Server)
- Check that the port (default: 1234) matches what's in your .env file
#### "No models are loaded"
- Open LM Studio and load a model
- Verify the model is running successfully
#### "MCP package not found"
- Try reinstalling the dependencies:

  ```
  pip install "mcp[cli]" httpx python-dotenv
  ```
- Make sure you're using Python 3.8 or later
#### "Claude can't find the bridge"
- Check Claude Desktop configuration
- Make sure the path to run_server.sh or run_server.bat is correct and absolute
- Verify the server script is executable (macOS/Linux):

  ```
  chmod +x run_server.sh
  ```
## Advanced Configuration
You can customize the bridge's behavior by creating a `.env` file in the repository root. For example, set `DEBUG=true` to enable verbose logging for troubleshooting.
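For illustration, such a `.env` file might look like the sketch below. Only `DEBUG` and the default port 1234 are named in this document; the host/port variable names are assumptions, so check the bridge's source for the exact ones:

```
# Variable names other than DEBUG are assumptions
LMSTUDIO_HOST=localhost
LMSTUDIO_PORT=1234
DEBUG=true
```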
## License
MIT