Server Configuration

Describes the environment variables used to configure the server.

Name | Required | Description | Default
DEFAULT_MODEL | No | Specific model selection. Set to 'auto' for automatic selection or specify a model name (e.g., 'gemini-2.0-flash-exp') | auto
CUSTOM_API_URL | No | Custom API URL for local Ollama setup, vLLM, or other compatible providers | (none)
GEMINI_API_KEY | No | Gemini API key for Google AI models | (none)
OPENAI_API_KEY | No | OpenAI API key for GPT models | (none)
OPENROUTER_API_KEY | No | OpenRouter API key for multiple model providers | (none)
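
As a minimal sketch (assuming a POSIX shell and that the server is launched from the same environment; the values shown are illustrative placeholders, not real credentials), the variables can be exported before starting the server:

# Export only the keys for the providers you intend to use; placeholder values shown.
export DEFAULT_MODEL="auto"                        # or a specific model name, e.g. 'gemini-2.0-flash-exp'
export GEMINI_API_KEY="<your-gemini-api-key>"      # hypothetical placeholder
export OPENROUTER_API_KEY="<your-openrouter-key>"  # hypothetical placeholder
# For a local Ollama or vLLM endpoint instead of a hosted provider (11434 is Ollama's default port):
export CUSTOM_API_URL="http://localhost:11434"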

Tools

Functions exposed to the LLM to take actions

No tools

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/petems/genz-mcp-server'
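
If the response is JSON (as the /api/mcp/v1/ path suggests), it can be piped through a pretty-printer for easier inspection; the use of jq below is an assumption about your local tooling, not part of the API:

curl -s 'https://glama.ai/api/mcp/v1/servers/petems/genz-mcp-server' | jq .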

If you have feedback or need assistance with the MCP directory API, please join our Discord server.