Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| MCP_MODE | Yes | Required for Claude Desktop. Set to 'stdio' to ensure only JSON-RPC messages are sent to stdout, preventing debug logs from interfering with the protocol. | stdio |
| LOG_LEVEL | No | The logging level for the server (e.g., 'error', 'info', 'debug'). | error |
| N8N_API_KEY | No | Your n8n API key, required for workflow management tools (create, update, execute). | |
| N8N_API_URL | No | The URL of your n8n instance (e.g., https://your-n8n-instance.com or http://host.docker.internal:5678). | |
| N8N_MCP_LLM_MODEL | No | Model name for local LLM integration to enable AI-powered documentation summaries. | |
| N8N_MCP_LLM_BASE_URL | No | Base URL for local LLM integration to enable AI-powered documentation summaries. | |
| WEBHOOK_SECURITY_MODE | No | Set to 'moderate' to allow webhooks to your local n8n instance while still blocking private networks and cloud metadata. | |
| DISABLE_CONSOLE_OUTPUT | No | Set to 'true' to prevent debug logs from interfering with the protocol. | true |
| SQLJS_SAVE_INTERVAL_MS | No | How long to wait (in milliseconds) after database changes before saving to disk when using the sql.js fallback. | 5000 |
| N8N_MCP_TELEMETRY_DISABLED | No | Set to 'true' to opt out of anonymous usage statistics. | |
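To make the table concrete, here is a minimal sketch of a stdio setup against a local n8n instance. Every value below is a placeholder rather than anything prescribed on this page; substitute your own instance URL and API key, and start the server with whatever launch command your installation uses.

```bash
# Minimal sketch: stdio configuration with workflow-management tools enabled.
# All values are placeholders; adjust for your installation.
export MCP_MODE=stdio                # only JSON-RPC messages go to stdout
export LOG_LEVEL=error               # keep logging quiet
export DISABLE_CONSOLE_OUTPUT=true   # keep debug output off the protocol stream
export N8N_API_URL=http://host.docker.internal:5678   # your n8n instance
export N8N_API_KEY=your-n8n-api-key                   # enables create/update/execute tools
# ...then start the server with your installation's launch command
# (for Claude Desktop, these can instead go in the "env" section of its MCP server config).
```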

Tools

Functions exposed to the LLM so it can take actions.


No tools

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/theblockchainbaby/caipher-mcp'
```
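For example, to pretty-print the JSON response in a terminal (assuming jq is installed):

```bash
# Fetch this server's directory entry and pretty-print the JSON response.
curl -s 'https://glama.ai/api/mcp/v1/servers/theblockchainbaby/caipher-mcp' | jq .
```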

If you have feedback or need assistance with the MCP directory API, please join our Discord server.