
Zen MCP Server

Server Configuration

Describes the environment variables used to configure the server. All variables are optional; defaults are listed where they apply.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| LOG_LEVEL | No | Controls logging verbosity (DEBUG, INFO, WARNING, ERROR) | INFO |
| DEFAULT_MODEL | No | The default model to use (auto, pro, flash, o3, o3-mini, or a specific model name) | auto |
| GEMINI_API_KEY | No | Your Google AI Studio API key for accessing Gemini models | |
| OPENAI_API_KEY | No | Your OpenAI Platform API key for accessing o3 models | |
| WORKSPACE_ROOT | No | Path to your workspace directory; configured automatically in the Docker setup | |
| OPENROUTER_API_KEY | No | Your OpenRouter API key for accessing multiple models through one API | |
| TEMPERATURE_BALANCED | No | Temperature setting for general chat (balanced creativity/accuracy) | 0.5 |
| TEMPERATURE_CREATIVE | No | Temperature setting for deep thinking and architecture (more creative) | 0.7 |
| TEMPERATURE_ANALYTICAL | No | Temperature setting for code review and debugging (focused, deterministic) | 0.2 |
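As an illustration, a minimal environment file might set a few of these variables. This is only a sketch assuming the server reads its configuration from the process environment; the key values below are placeholders, not real credentials.

```shell
# Sketch of a server environment — values are illustrative placeholders.
export LOG_LEVEL=DEBUG                 # default is INFO
export DEFAULT_MODEL=auto              # or pro, flash, o3, o3-mini, or a model name
export GEMINI_API_KEY=your-gemini-key  # hypothetical placeholder
export OPENAI_API_KEY=your-openai-key  # hypothetical placeholder
export TEMPERATURE_ANALYTICAL=0.2      # focused, deterministic responses
```

With no variables set, the documented defaults apply (INFO logging, the `auto` model, and the three temperature presets).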

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

No tools

MCP directory API

We provide all information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/BeehiveInnovations/zen-mcp-server'
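Judging by the example URL, the endpoint appears to follow a `servers/{owner}/{server}` pattern. A small helper can build such URLs for other servers; this is an assumption inferred from the single example above, and `server_api_url` is a hypothetical name, not part of any official tooling.

```shell
# Hypothetical helper: builds an MCP directory API URL from an owner and a
# server slug, following the pattern of the example request above.
server_api_url() {
  # $1 = owner (e.g. BeehiveInnovations), $2 = server slug (e.g. zen-mcp-server)
  printf 'https://glama.ai/api/mcp/v1/servers/%s/%s\n' "$1" "$2"
}

# Example usage (fetch the record with curl):
#   curl -X GET "$(server_api_url BeehiveInnovations zen-mcp-server)"
server_api_url BeehiveInnovations zen-mcp-server
```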

If you have feedback or need assistance with the MCP directory API, please join our Discord server.