.env.example • 950 B
# MCP Server Configuration
MCP_SERVER_NAME=ai-mcp-gateway
MCP_SERVER_VERSION=0.1.0

# API Keys for LLM Providers
OPENROUTER_API_KEY=your_openrouter_key_here
ANTHROPIC_API_KEY=your_anthropic_key_here
OPENAI_API_KEY=your_openai_key_here

# OSS/Local Model Configuration (optional)
OSS_MODEL_ENDPOINT=http://localhost:11434
OSS_MODEL_ENABLED=false

# Redis Configuration
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=
REDIS_DB=0

# PostgreSQL Database Configuration
DATABASE_URL=
DB_HOST=localhost
DB_PORT=5432
DB_NAME=ai_mcp_gateway
DB_USER=postgres
DB_PASSWORD=
DB_SSL=false

# HTTP API Configuration
API_PORT=3000
API_HOST=0.0.0.0
API_CORS_ORIGIN=*

# Logging
LOG_LEVEL=info
LOG_FILE=logs/ai-mcp-gateway.log

# Routing Configuration
DEFAULT_LAYER=L0
ENABLE_CROSS_CHECK=true
ENABLE_AUTO_ESCALATE=true
MAX_ESCALATION_LAYER=L2

# Cost Tracking
ENABLE_COST_TRACKING=true
COST_ALERT_THRESHOLD=1.00

# Mode: mcp (stdio) or api (HTTP server)
MODE=mcp
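
Below is a minimal sketch of how these variables might be read at startup, assuming the gateway is a Node/TypeScript project that uses the dotenv package. The GatewayConfig interface and loadConfig helper are illustrative, not part of the repository.

    // Load .env values into process.env and map them to a typed config object.
    import * as dotenv from "dotenv";

    dotenv.config();

    // Illustrative subset of the variables defined in .env.example.
    interface GatewayConfig {
      mode: "mcp" | "api";
      apiPort: number;
      redisHost: string;
      redisPort: number;
      defaultLayer: string;
      enableCostTracking: boolean;
    }

    function loadConfig(): GatewayConfig {
      return {
        // MODE selects the stdio MCP transport or the HTTP API server.
        mode: process.env.MODE === "api" ? "api" : "mcp",
        apiPort: Number(process.env.API_PORT ?? 3000),
        redisHost: process.env.REDIS_HOST ?? "localhost",
        redisPort: Number(process.env.REDIS_PORT ?? 6379),
        defaultLayer: process.env.DEFAULT_LAYER ?? "L0",
        enableCostTracking: (process.env.ENABLE_COST_TRACKING ?? "true") === "true",
      };
    }

    console.log(loadConfig());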

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/babasida246/ai-mcp-gateway'
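
The same request can be made programmatically. A minimal TypeScript sketch using the fetch API built into Node 18+ follows; the getServerInfo helper is illustrative, and the endpoint URL is taken verbatim from the curl example above.

    // Query the Glama MCP directory API for this server's listing.
    const url = "https://glama.ai/api/mcp/v1/servers/babasida246/ai-mcp-gateway";

    async function getServerInfo(): Promise<unknown> {
      const res = await fetch(url);
      if (!res.ok) {
        throw new Error(`MCP directory API returned ${res.status}`);
      }
      return res.json();
    }

    getServerInfo().then((info) => console.log(info)).catch(console.error);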

If you have feedback or need assistance with the MCP directory API, please join our Discord server.