We provide all the information about MCP servers via our MCP API.
curl -X GET 'https://glama.ai/api/mcp/v1/servers/wx-b/long-context-mcp'
If you have feedback or need assistance with the MCP directory API, please join our Discord server.
litellm.env.example
# LiteLLM Proxy Configuration
# Install and configure LiteLLM proxy server
# API key for LiteLLM proxy (configure in your proxy)
LITELLM_API_KEY=your_litellm_proxy_key_here
# The base URL should point to your LiteLLM proxy server
LITELLM_BASE_URL=http://localhost:4000/v1
# Additional LiteLLM proxy configuration
# Configure routing and keys in your LiteLLM proxy config.yaml
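To illustrate how these variables might be consumed, here is a minimal sketch that reads the two settings from the environment and builds keyword arguments for an OpenAI-compatible client pointed at the LiteLLM proxy. The defaults and the helper name `litellm_client_config` are assumptions for illustration, not part of the example file; 4000 is LiteLLM's default proxy port.

```python
import os

# Mirror the variables from litellm.env.example. The fallback values are
# assumptions: they match the placeholder key and the commonly used
# default proxy address http://localhost:4000/v1.
LITELLM_API_KEY = os.environ.get("LITELLM_API_KEY", "your_litellm_proxy_key_here")
LITELLM_BASE_URL = os.environ.get("LITELLM_BASE_URL", "http://localhost:4000/v1")

def litellm_client_config() -> dict:
    """Return kwargs for an OpenAI-compatible client (hypothetical helper).

    Because the LiteLLM proxy speaks the OpenAI API, any such client can
    be pointed at it by overriding api_key and base_url; model routing
    itself stays in the proxy's config.yaml.
    """
    return {"api_key": LITELLM_API_KEY, "base_url": LITELLM_BASE_URL}

config = litellm_client_config()
```

A client would then be constructed with these kwargs (for example, `OpenAI(**config)` with the OpenAI Python SDK), so swapping providers only requires editing the proxy's config.yaml, not application code.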