MindBridge MCP Server

.env.example
# Second Opinion MCP Server Configuration

# OpenAI Configuration
OPENAI_API_KEY=
# OpenAI uses a fixed endpoint: https://api.openai.com/v1

# Anthropic Configuration
ANTHROPIC_API_KEY=
# Anthropic uses a fixed endpoint: https://api.anthropic.com

# DeepSeek Configuration
DEEPSEEK_API_KEY=
# DeepSeek uses a fixed endpoint: https://api.deepseek.com

# Google AI Configuration
GOOGLE_API_KEY=
# Google uses a fixed endpoint: https://generativelanguage.googleapis.com/v1beta

# OpenRouter Configuration
OPENROUTER_API_KEY=
# OpenRouter doesn't require a base URL as it's fixed

# OpenAI-Compatible API Configuration (for third-party services that use OpenAI's API format)
OPENAI_COMPATIBLE_API_KEY=        # Optional: Some services don't require an API key
OPENAI_COMPATIBLE_API_BASE_URL=   # Required: Full URL to the API endpoint
OPENAI_COMPATIBLE_API_MODELS=     # Optional: Comma-separated list of available models

# Ollama Configuration
OLLAMA_BASE_URL=http://localhost:11434  # Change if running Ollama on a different host

# Optional: Default parameters
DEFAULT_TEMPERATURE=0.7           # Default temperature for non-reasoning models
# Note: Max tokens are model-specific and should be set per request
DEFAULT_REASONING_EFFORT=medium   # Default reasoning effort for supported models
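As a rough illustration of how these variables might be consumed, here is a minimal TypeScript sketch that reads them with dotenv and applies the documented defaults. It is not the project's actual loader; the ProviderConfig shape and the providers object are illustrative assumptions, while the variable names and default values match the .env.example above.

// Hypothetical config loader, assuming dotenv and Node.js.
import "dotenv/config";

interface ProviderConfig {
  apiKey?: string;   // omitted for providers that don't need one (e.g. Ollama)
  baseUrl?: string;  // fixed for hosted providers, configurable for Ollama
}

const providers: Record<string, ProviderConfig> = {
  openai: { apiKey: process.env.OPENAI_API_KEY, baseUrl: "https://api.openai.com/v1" },
  anthropic: { apiKey: process.env.ANTHROPIC_API_KEY, baseUrl: "https://api.anthropic.com" },
  deepseek: { apiKey: process.env.DEEPSEEK_API_KEY, baseUrl: "https://api.deepseek.com" },
  ollama: { baseUrl: process.env.OLLAMA_BASE_URL ?? "http://localhost:11434" },
};

// Defaults applied when a request does not override them; max tokens stay per-request.
const defaultTemperature = Number(process.env.DEFAULT_TEMPERATURE ?? "0.7");
const defaultReasoningEffort = process.env.DEFAULT_REASONING_EFFORT ?? "medium";

console.log({ providers, defaultTemperature, defaultReasoningEffort });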

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/pinkpixel-dev/mindbridge-mcp'
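The same request can be made programmatically. Below is a hedged TypeScript sketch using the built-in fetch available in Node.js 18+; since the response schema is not documented on this page, the result is simply logged as returned.

// Fetch the MindBridge entry from the Glama MCP directory API.
const url = "https://glama.ai/api/mcp/v1/servers/pinkpixel-dev/mindbridge-mcp";

async function main(): Promise<void> {
  const res = await fetch(url);
  if (!res.ok) {
    throw new Error(`Directory API request failed: ${res.status}`);
  }
  const server = await res.json();
  console.log(JSON.stringify(server, null, 2));
}

main().catch(console.error);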

If you have feedback or need assistance with the MCP directory API, please join our Discord server.