LMStudio-MCP
A Model Context Protocol (MCP) server that allows Claude to communicate with locally running LLM models via LM Studio.
Overview
LMStudio-MCP creates a bridge between Claude (with MCP capabilities) and your locally running LM Studio instance. This allows Claude to:
- Check the health of your LM Studio API
- List available models
- Get the currently loaded model
- Generate completions using your local models
This enables you to leverage your own locally running models through Claude's interface, combining Claude's capabilities with your private models.
Prerequisites
- Python 3.7+
- LM Studio installed and running locally with a model loaded
- Claude with MCP access
- Required Python packages (see Installation)
Installation
- Clone this repository
- Install the required packages:
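The steps above amount to something like the following; the repository URL is a placeholder and a `requirements.txt` file is assumed:

```shell
# Placeholder URL -- substitute the actual repository path
git clone https://github.com/<owner>/LMStudio-MCP.git
cd LMStudio-MCP

# Install the required packages (requirements.txt assumed)
pip install -r requirements.txt
```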
MCP Configuration
For Claude to connect to this bridge, you need to configure the MCP settings properly. You can either:
- Use directly from GitHub
- Use a local installation
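As a rough sketch, the GitHub variant might look like the entry below in Claude Desktop's `claude_desktop_config.json`; the URL is a placeholder, and the exact options are documented in MCP_CONFIGURATION.md:

```json
{
  "mcpServers": {
    "lmstudio-mcp": {
      "command": "uvx",
      "args": ["https://github.com/<owner>/LMStudio-MCP"]
    }
  }
}
```

For a local installation, the `command` would instead point at your Python interpreter, with the path to the server script in `args`.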
For detailed MCP configuration instructions, see MCP_CONFIGURATION.md.
Usage
- Start your LM Studio application and ensure it's running on port 1234 (the default)
- Load a model in LM Studio
- If running locally (not using `uvx`), run the LMStudio-MCP server
- In Claude, connect to the MCP server when prompted by selecting "lmstudio-mcp"
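Starting the server locally would look something like this; the script name here is an assumption, so check the repository for the actual entry point:

```shell
# Script name is an assumption -- see the repository for the real entry point
python lmstudio_mcp_server.py
```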
Available Functions
The bridge provides the following functions:
- `health_check()`: Verify that the LM Studio API is accessible
- `list_models()`: Get a list of all available models in LM Studio
- `get_current_model()`: Identify which model is currently loaded
- `chat_completion(prompt, system_prompt, temperature, max_tokens)`: Generate text from your local model
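As a minimal sketch of what `chat_completion()` does under the hood: it posts an OpenAI-style request to LM Studio's `/v1/chat/completions` endpoint. The default values and helper names below are illustrative assumptions, not the bridge's actual code:

```python
# Hypothetical sketch of chat_completion() against LM Studio's
# OpenAI-compatible API (defaults and names are assumptions).
import json
import urllib.request

LMSTUDIO_URL = "http://127.0.0.1:1234/v1"  # LM Studio's default server address

def build_chat_payload(prompt, system_prompt="You are a helpful assistant.",
                       temperature=0.7, max_tokens=1024):
    """Assemble an OpenAI-style chat-completions request body."""
    return {
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": prompt},
        ],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

def chat_completion(prompt, **params):
    """POST the payload to LM Studio and return the first choice's text."""
    body = json.dumps(build_chat_payload(prompt, **params)).encode()
    req = urllib.request.Request(
        f"{LMSTUDIO_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

Because the endpoint speaks the OpenAI chat format, whichever model is currently loaded in LM Studio answers the request.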
Known Limitations
- Some models (e.g., phi-3.5-mini-instruct_uncensored) may have compatibility issues
- The bridge currently uses only the OpenAI-compatible API endpoints of LM Studio
- Model responses will be limited by the capabilities of your locally loaded model
Troubleshooting
API Connection Issues
If Claude reports 404 errors when trying to connect to LM Studio:
- Ensure LM Studio is running and has a model loaded
- Check that LM Studio's server is running on port 1234
- Verify your firewall isn't blocking the connection
- Try using "127.0.0.1" instead of "localhost" in the API URL if issues persist
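A quick way to check the first two points is to test whether anything is listening on the default port at all. This standalone helper is illustrative, not part of the bridge:

```python
# Quick connectivity probe for LM Studio's default server port.
import socket

def lmstudio_reachable(host="127.0.0.1", port=1234, timeout=2.0):
    """Return True if a TCP connection to the given host/port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, or unreachable
        return False

if __name__ == "__main__":
    if lmstudio_reachable():
        print("LM Studio server is reachable on 127.0.0.1:1234")
    else:
        print("Nothing is listening on 127.0.0.1:1234 -- is the server started?")
```

If this reports nothing listening, start LM Studio's local server (and confirm the port) before debugging Claude's side of the connection.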
Model Compatibility
If certain models don't work correctly:
- Some models might not fully support the OpenAI chat completions API format
- Try different parameter values (temperature, max_tokens) for problematic models
- Consider switching to a more compatible model if problems persist
For more detailed troubleshooting help, see TROUBLESHOOTING.md.
License
MIT
Acknowledgements
This project was originally developed as "Claude-LMStudio-Bridge_V2" and has been renamed and open-sourced as "LMStudio-MCP".