LMStudio-MCP

A Model Context Protocol (MCP) server that allows Claude to communicate with locally running LLMs via LM Studio.

Overview

LMStudio-MCP creates a bridge between Claude (with MCP capabilities) and your locally running LM Studio instance. This allows Claude to:

  • Check the health of your LM Studio API
  • List available models
  • Get the currently loaded model
  • Generate completions using your local models

This enables you to leverage your own locally running models through Claude's interface, combining Claude's capabilities with your private models.

Prerequisites

  • Python 3.7+
  • LM Studio installed and running locally with a model loaded
  • Claude with MCP access
  • Required Python packages (see Installation)

🚀 Quick Installation

curl -fsSL https://raw.githubusercontent.com/infinitimeless/LMStudio-MCP/main/install.sh | bash

Manual Installation Methods

1. Local Python Installation
git clone https://github.com/infinitimeless/LMStudio-MCP.git
cd LMStudio-MCP
pip install requests "mcp[cli]" openai
2. Docker Installation
# Using the pre-built image
docker run -it --network host ghcr.io/infinitimeless/lmstudio-mcp:latest

# Or build locally
git clone https://github.com/infinitimeless/LMStudio-MCP.git
cd LMStudio-MCP
docker build -t lmstudio-mcp .
docker run -it --network host lmstudio-mcp
3. Docker Compose
git clone https://github.com/infinitimeless/LMStudio-MCP.git
cd LMStudio-MCP
docker-compose up -d

For detailed deployment instructions, see DOCKER.md.

MCP Configuration

Quick Setup

Using GitHub directly (simplest):

{ "lmstudio-mcp": { "command": "uvx", "args": [ "https://github.com/infinitimeless/LMStudio-MCP" ] } }

Using local installation:

{ "lmstudio-mcp": { "command": "/bin/bash", "args": [ "-c", "cd /path/to/LMStudio-MCP && source venv/bin/activate && python lmstudio_bridge.py" ] } }

Using Docker:

{ "lmstudio-mcp-docker": { "command": "docker", "args": [ "run", "-i", "--rm", "--network=host", "ghcr.io/infinitimeless/lmstudio-mcp:latest" ] } }

For complete MCP configuration instructions, see MCP_CONFIGURATION.md.

Usage

  1. Start LM Studio and make sure its local server is running on port 1234 (the default)
  2. Load a model in LM Studio
  3. Configure Claude MCP with one of the configurations above
  4. Connect to the MCP server in Claude when prompted

Available Functions

The bridge provides the following functions:

  • health_check(): Verify if LM Studio API is accessible
  • list_models(): Get a list of all available models in LM Studio
  • get_current_model(): Identify which model is currently loaded
  • chat_completion(prompt, system_prompt, temperature, max_tokens): Generate text from your local model
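Under the hood, these functions map onto LM Studio's OpenAI-compatible REST endpoints. A minimal sketch of what a chat_completion call amounts to, assuming LM Studio's default base URL of http://localhost:1234/v1 (the defaults shown here are illustrative, not the bridge's exact values):

import requests

LMSTUDIO_API = "http://localhost:1234/v1"  # LM Studio's default OpenAI-compatible base URL

def chat_completion(prompt, system_prompt="You are a helpful assistant.",
                    temperature=0.7, max_tokens=1024):
    """Illustrative sketch: ask the currently loaded model for a chat completion."""
    response = requests.post(
        f"{LMSTUDIO_API}/chat/completions",
        json={
            "messages": [
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": prompt},
            ],
            "temperature": temperature,
            "max_tokens": max_tokens,
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

list_models follows the same pattern, boiling down to a GET against /v1/models.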

Deployment Options

This project supports multiple deployment methods:

Method         | Use Case                  | Pros                     | Cons
Local Python   | Development, simple setup | Fast, direct control     | Requires Python setup
Docker         | Isolated environments     | Clean, portable          | Requires Docker
Docker Compose | Production deployments    | Easy management          | More complex setup
Kubernetes     | Enterprise/scale          | Highly scalable          | Complex configuration
GitHub Direct  | Zero setup                | No local install needed  | Requires internet

Known Limitations

  • Some models (e.g., phi-3.5-mini-instruct_uncensored) may have compatibility issues
  • The bridge currently uses only the OpenAI-compatible API endpoints of LM Studio
  • Model responses will be limited by the capabilities of your locally loaded model

Troubleshooting

API Connection Issues

If Claude reports 404 errors when trying to connect to LM Studio:

  • Ensure LM Studio is running and has a model loaded
  • Check that LM Studio's server is running on port 1234
  • Verify your firewall isn't blocking the connection
  • Try using "127.0.0.1" instead of "localhost" in the API URL if issues persist
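A quick way to rule out the bridge itself is to probe LM Studio's models endpoint directly from the same machine. A minimal sketch, assuming the default port 1234:

import requests

# Check both hostnames, since "localhost" can resolve differently than 127.0.0.1
for host in ("localhost", "127.0.0.1"):
    url = f"http://{host}:1234/v1/models"
    try:
        r = requests.get(url, timeout=5)
        print(f"{url}: HTTP {r.status_code}")
    except requests.RequestException as exc:
        print(f"{url}: connection failed ({exc})")

If neither URL returns HTTP 200, the problem is with LM Studio's server rather than the MCP configuration.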

Model Compatibility

If certain models don't work correctly:

  • Some models might not fully support the OpenAI chat completions API format
  • Try different parameter values (temperature, max_tokens) for problematic models
  • Consider switching to a more compatible model if problems persist

For detailed troubleshooting help, see TROUBLESHOOTING.md.

🐳 Docker & Containerization

This project includes comprehensive Docker support:

  • Multi-architecture images (AMD64, ARM64/Apple Silicon)
  • Automated builds via GitHub Actions
  • Pre-built images available on GitHub Container Registry
  • Docker Compose for easy deployment
  • Kubernetes manifests for production deployments

See DOCKER.md for complete containerization documentation.

Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.

License

MIT

Acknowledgements

This project was originally developed as "Claude-LMStudio-Bridge_V2" and has been renamed and open-sourced as "LMStudio-MCP".


🌟 If this project helps you, please consider giving it a star!


