.env.example
# This is an example .env file. Copy it to .env and fill in your values.
# Lines starting with # are comments.

# =============================================================================
# LLM CONFIGURATION
# =============================================================================

# Provider selection (required)
# Options: openai, anthropic, google, azure_openai, groq, deepseek,
#          cerebras, ollama, bedrock, browser_use, openrouter, vercel
MCP_LLM_PROVIDER=anthropic

# Model name (provider-specific)
MCP_LLM_MODEL_NAME=claude-sonnet-4-20250514

# Optional: Custom base URL for OpenAI-compatible APIs (e.g., vLLM, local servers)
# MCP_LLM_BASE_URL=

# =============================================================================
# API KEYS - Standard Environment Variable Names (Recommended)
# =============================================================================
# The server automatically detects these based on your provider setting.
# You do NOT need to use the MCP_LLM_ prefix for API keys anymore.
#
# Priority order (highest to lowest):
#   1. MCP_LLM_API_KEY (generic override, applies to any provider)
#   2. <PROVIDER>_API_KEY (standard names below)
#   3. MCP_LLM_<PROVIDER>_API_KEY (legacy, backward compatible)

# OpenAI
# OPENAI_API_KEY=sk-...

# Anthropic (works automatically in Claude Code context!)
# ANTHROPIC_API_KEY=sk-ant-...

# Google / Gemini (GEMINI_API_KEY takes priority over GOOGLE_API_KEY)
# GEMINI_API_KEY=...
# GOOGLE_API_KEY=...

# Groq
# GROQ_API_KEY=gsk_...

# DeepSeek
# DEEPSEEK_API_KEY=...

# Cerebras
# CEREBRAS_API_KEY=...

# OpenRouter
# OPENROUTER_API_KEY=sk-or-...

# Browser Use Cloud
# BROWSER_USE_API_KEY=...

# Vercel AI Gateway
# VERCEL_API_KEY=...

# =============================================================================
# PROVIDER-SPECIFIC CONFIGURATION
# =============================================================================

# --- Azure OpenAI ---
# Requires an endpoint in addition to the API key
# AZURE_OPENAI_API_KEY=...
# MCP_LLM_AZURE_ENDPOINT=https://your-resource.openai.azure.com
# MCP_LLM_AZURE_API_VERSION=2024-02-01

# --- AWS Bedrock ---
# Uses standard AWS credentials (no API key needed)
# AWS_ACCESS_KEY_ID=...
# AWS_SECRET_ACCESS_KEY=...
# AWS_DEFAULT_REGION=us-east-1
# MCP_LLM_AWS_REGION=us-east-1

# --- Ollama (local, no API key needed) ---
# MCP_LLM_BASE_URL=http://localhost:11434

# =============================================================================
# LEGACY SUPPORT (Backward Compatibility)
# =============================================================================
# These MCP_LLM_-prefixed keys still work, but the standard names above are preferred.
# MCP_LLM_OPENAI_API_KEY=...
# MCP_LLM_ANTHROPIC_API_KEY=...
# MCP_LLM_GOOGLE_API_KEY=...

# =============================================================================
# GENERIC OVERRIDE (Highest Priority)
# =============================================================================
# This applies to ANY provider; useful for quick testing.
# MCP_LLM_API_KEY=...
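The three-level priority order documented above can be illustrated with a short Python sketch. This is not the server's actual implementation: resolve_api_key and the provider table are hypothetical names, and the real lookup code in mcp-browser-use may differ in detail.

import os

def resolve_api_key(provider: str) -> str | None:
    # 1. Generic override: applies to any provider (highest priority).
    generic = os.environ.get("MCP_LLM_API_KEY")
    if generic:
        return generic

    # Google is documented above as checking GEMINI_API_KEY before GOOGLE_API_KEY.
    standard_names = {"google": ["GEMINI_API_KEY", "GOOGLE_API_KEY"]}.get(
        provider, [f"{provider.upper()}_API_KEY"]
    )

    # 2. Standard environment variable name(s) for the provider.
    for name in standard_names:
        if value := os.environ.get(name):
            return value

    # 3. Legacy MCP_LLM_-prefixed name (backward compatible, lowest priority).
    return os.environ.get(f"MCP_LLM_{provider.upper()}_API_KEY")

# With MCP_LLM_PROVIDER=anthropic, this checks MCP_LLM_API_KEY, then
# ANTHROPIC_API_KEY, then MCP_LLM_ANTHROPIC_API_KEY.
key = resolve_api_key(os.environ.get("MCP_LLM_PROVIDER", "anthropic"))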
# =============================================================================
# BROWSER CONFIGURATION (MCP_BROWSER_*)
# =============================================================================
MCP_BROWSER_HEADLESS=true

# Optional: Proxy server URL (e.g., http://host:8080)
# MCP_BROWSER_PROXY_SERVER=

# Optional: Comma-separated hosts to bypass the proxy
# MCP_BROWSER_PROXY_BYPASS=

# =============================================================================
# AGENT CONFIGURATION (MCP_AGENT_*)
# =============================================================================
MCP_AGENT_MAX_STEPS=20
MCP_AGENT_USE_VISION=true

# =============================================================================
# DEEP RESEARCH CONFIGURATION (MCP_RESEARCH_*)
# =============================================================================
MCP_RESEARCH_MAX_SEARCHES=5

# Optional: Directory to save research reports
# MCP_RESEARCH_SAVE_DIRECTORY=./tmp/research

MCP_RESEARCH_SEARCH_TIMEOUT=120

# =============================================================================
# SERVER CONFIGURATION (MCP_SERVER_*)
# =============================================================================
MCP_SERVER_LOGGING_LEVEL=INFO

# =============================================================================
# FASTMCP BACKGROUND TASKS (FASTMCP_*)
# =============================================================================
# Background tasks allow long-running operations like deep research to run
# asynchronously with progress tracking via the MCP task protocol.

# Enable background task support (disabled by default)
FASTMCP_ENABLE_TASKS=true

# Task queue backend URL
# Options:
#   memory://            - In-memory (default, ephemeral, single-process)
#   redis://host:port/db - Redis/Valkey (persistent, distributed)
FASTMCP_DOCKET_URL=memory://

# Worker concurrency (for additional workers)
# FASTMCP_DOCKET_CONCURRENCY=10
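As a quick check that the file is wired up correctly, the settings can be loaded and read with a few lines of Python. This sketch assumes the python-dotenv package is installed; the variable names match the file above, but the parsing shown is illustrative rather than the server's own code.

import os
from dotenv import load_dotenv  # pip install python-dotenv

# Copy this file to .env first; load_dotenv() reads it into os.environ
# without overriding variables already set in the environment.
load_dotenv()

headless = os.environ.get("MCP_BROWSER_HEADLESS", "true").lower() == "true"
max_steps = int(os.environ.get("MCP_AGENT_MAX_STEPS", "20"))
docket_url = os.environ.get("FASTMCP_DOCKET_URL", "memory://")

# memory:// keeps background tasks in-process and ephemeral; a
# redis://host:port/db URL selects a persistent, distributed queue.
print(f"headless={headless} max_steps={max_steps} queue={docket_url}")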
