.env.example

# LLM API Configuration

# Required: Your LLM API key
LLM_API_KEY=your_api_key_here

# Optional: Custom LLM base URL (defaults to OpenAI)
LLM_BASE_URL=https://api.openai.com/v1

# Optional: LLM model to use (defaults to gpt-3.5-turbo)
LLM_MODEL=gpt-3.5-turbo

# Examples for different providers:

# OpenAI (default)
# LLM_API_KEY=sk-...
# LLM_BASE_URL=https://api.openai.com/v1
# LLM_MODEL=gpt-3.5-turbo

# Anthropic Claude
# LLM_API_KEY=sk-ant-...
# LLM_BASE_URL=https://api.anthropic.com/v1
# LLM_MODEL=claude-3-haiku-20240307

# Google Gemini
# LLM_API_KEY=...
# LLM_BASE_URL=https://generativelanguage.googleapis.com/v1beta
# LLM_MODEL=gemini-pro

# Z.AI GLM
LLM_API_KEY=your_zai_api_key_here
LLM_BASE_URL=https://api.z.ai/api/paas/v4/
LLM_MODEL=glm-4.6

# Local LLM (Ollama)
# LLM_API_KEY=ollama
# LLM_BASE_URL=http://localhost:11434/v1
# LLM_MODEL=llama2
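A minimal sketch of how an application might read these variables at startup, using only Python's standard library. The variable names come from the file above; the `load_llm_config` helper and its fallback defaults (taken from the comments in the file) are illustrative assumptions, not part of the server's actual code:

```python
import os

def load_llm_config():
    """Read LLM settings from the environment, applying the documented defaults."""
    api_key = os.environ.get("LLM_API_KEY")
    if not api_key:
        # LLM_API_KEY is marked as required in .env.example
        raise RuntimeError("LLM_API_KEY is required; see .env.example")
    return {
        "api_key": api_key,
        "base_url": os.environ.get("LLM_BASE_URL", "https://api.openai.com/v1"),
        "model": os.environ.get("LLM_MODEL", "gpt-3.5-turbo"),
    }
```

With only `LLM_API_KEY` set, the config falls back to the OpenAI defaults noted in the comments; setting `LLM_BASE_URL` and `LLM_MODEL` switches the same client to any OpenAI-compatible provider.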

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/pythondev-pro/egw_writings_mcp_server'
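The same lookup can be scripted. A short Python sketch that builds the endpoint URL for a given server, assuming the `servers/{owner}/{name}` path structure shown in the curl example above (the `server_url` helper is illustrative):

```python
from urllib.parse import quote

BASE = "https://glama.ai/api/mcp/v1"

def server_url(owner: str, name: str) -> str:
    """Build the MCP directory endpoint for a server identified by owner/name."""
    return f"{BASE}/servers/{quote(owner, safe='')}/{quote(name, safe='')}"
```

For example, `server_url("pythondev-pro", "egw_writings_mcp_server")` reproduces the URL used in the curl command, and the result can be fetched with any HTTP client.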

If you have feedback or need assistance with the MCP directory API, please join our Discord server.