LMStudio-MCP

docker-compose.yml
version: '3.8'

services:
  lmstudio-mcp:
    build: .
    container_name: lmstudio-mcp-server
    restart: unless-stopped
    network_mode: "host"  # Required to access LM Studio on localhost
    environment:
      - LMSTUDIO_API_BASE=http://localhost:1234/v1
    volumes:
      - ./logs:/app/logs  # Mount logs directory for persistence
    stdin_open: true  # Required for MCP stdio communication
    tty: true
    command: ["python", "lmstudio_bridge.py"]
    healthcheck:
      test: ["CMD", "python", "-c", "import lmstudio_bridge; print('OK')"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 10s

  # Optional: LM Studio service (if running in container)
  # This is for reference - you may prefer to run LM Studio natively
  # lmstudio:
  #   image: "lmstudio/server:latest"  # This is hypothetical
  #   container_name: lmstudio-server
  #   ports:
  #     - "1234:1234"
  #   volumes:
  #     - ./models:/models
  #   environment:
  #     - MODEL_PATH=/models
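Because the container relies on LM Studio already listening at LMSTUDIO_API_BASE on the host, it can help to verify that endpoint before bringing the service up. Below is a minimal sketch, assuming LM Studio is serving its OpenAI-compatible API on localhost:1234; the /v1/models endpoint and response fields are assumptions based on that API style, not taken from this repository.

import json
import os
import urllib.request

# Use the same base URL the container is configured with
base = os.environ.get("LMSTUDIO_API_BASE", "http://localhost:1234/v1")

# Query the models endpoint; a successful response means the bridge
# container should be able to reach LM Studio over host networking
with urllib.request.urlopen(f"{base}/models", timeout=5) as resp:
    data = json.load(resp)

print("LM Studio reachable; loaded models:", [m.get("id") for m in data.get("data", [])])

If this check fails, confirm that LM Studio's local server is running before starting the container, since network_mode: "host" only exposes the host's localhost to the container, it does not start LM Studio for you.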

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/infinitimeless/LMStudio-MCP'
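The same lookup can be scripted instead of run through curl. A minimal sketch, assuming the endpoint returns a JSON payload (the response schema is not documented here, so the example only pretty-prints whatever comes back):

import json
import urllib.request

# Same URL as the curl example above
url = "https://glama.ai/api/mcp/v1/servers/infinitimeless/LMStudio-MCP"

with urllib.request.urlopen(url, timeout=10) as resp:
    payload = json.load(resp)

# Print the raw server metadata returned by the directory API
print(json.dumps(payload, indent=2))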

If you have feedback or need assistance with the MCP directory API, please join our Discord server.