
Self-hosted LLM MCP Server

docker-compose.yml
version: '3.8'

services:
  # Ollama service for self-hosted LLM
  ollama:
    image: ollama/ollama:latest
    container_name: mcp-ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama
    environment:
      - OLLAMA_HOST=0.0.0.0
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:11434/api/tags"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s

  # MCP Server
  mcp-server:
    build: .
    container_name: mcp-server
    ports:
      - "4000:3000"  # host port 4000 -> container port 3000 (matches MCP_SERVER_PORT and the healthcheck below)
    environment:
      - SUPABASE_URL=${SUPABASE_URL}
      - SUPABASE_ANON_KEY=${SUPABASE_ANON_KEY}
      - SUPABASE_SERVICE_ROLE_KEY=${SUPABASE_SERVICE_ROLE_KEY}
      - LLM_BASE_URL=http://ollama:11434
      - LLM_MODEL=${LLM_MODEL:-llama2}
      - LLM_TIMEOUT=${LLM_TIMEOUT:-30000}
      - MCP_SERVER_PORT=3000
      - MCP_SERVER_HOST=0.0.0.0
      - LOG_LEVEL=${LOG_LEVEL:-info}
      - LOG_FORMAT=${LOG_FORMAT:-json}
    depends_on:
      - ollama
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "node", "-e", "require('http').get('http://localhost:3000/health', (res) => { process.exit(res.statusCode === 200 ? 0 : 1) })"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 30s

  # Optional: Supabase Local Development (uncomment if using local Supabase)
  # supabase:
  #   image: supabase/postgres:15.1.0.117
  #   container_name: mcp-supabase-db
  #   ports:
  #     - "5432:5432"
  #   environment:
  #     POSTGRES_PASSWORD: your_password_here
  #     POSTGRES_DB: postgres
  #   volumes:
  #     - supabase_data:/var/lib/postgresql/data
  #   restart: unless-stopped

volumes:
  ollama_data:
  # supabase_data:

networks:
  default:
    name: mcp-network
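The mcp-server service expects its Supabase credentials and optional LLM settings to come from the environment, typically a .env file next to docker-compose.yml. A minimal sketch is shown below; the variable names are taken from the compose file above, all values are placeholders, and any variable with a default (such as LLM_MODEL) can be omitted.

.env
# Required Supabase credentials (replace the placeholders with your project's values)
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_ANON_KEY=your_anon_key
SUPABASE_SERVICE_ROLE_KEY=your_service_role_key

# Optional overrides; defaults are defined in docker-compose.yml
LLM_MODEL=llama2
LLM_TIMEOUT=30000
LOG_LEVEL=info
LOG_FORMAT=json

Start the stack with docker compose up -d --build. Note that the Ollama container starts with an empty model store, so the configured model must be pulled once before the MCP server can use it, for example with docker compose exec ollama ollama pull llama2.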

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Krishnahuex28/MCP'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.