
Server Configuration

Describes the environment variables required to run the server.

LOG_LEVEL (optional)

Logging level for verbose output (e.g., DEBUG for detailed logs).

REDIS_URL (optional; default: redis://localhost:6379)

Redis connection URL.

GEMINI_MODEL (optional; default: gemini-3-pro-preview)

The Gemini model to use for state merging. Available models: gemini-3-pro-preview (latest), gemini-2.5-pro (stable), gemini-2.5-flash (fast).

PROJECT_ROOT (optional)

The project root directory. Use ${workspaceFolder} to resolve to your current project directory automatically.

GEMINI_API_KEY (required)

Your Google Gemini API key, used for AI-powered state merging. Get one from https://aistudio.google.com/apikey.

ANTHROPIC_API_KEY (optional)

Your Anthropic API key, used as a fallback if Gemini fails. Get one from https://console.anthropic.com/.
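These variables are typically supplied through the MCP client configuration rather than the shell. A minimal sketch of a Claude Desktop-style mcpServers entry (the server key and launch command are assumptions; substitute the actual package name or path for this server):

```json
{
  "mcpServers": {
    "claude-memory": {
      "command": "npx",
      "args": ["-y", "claude-memory-mcp"],
      "env": {
        "GEMINI_API_KEY": "your-gemini-api-key",
        "REDIS_URL": "redis://localhost:6379",
        "GEMINI_MODEL": "gemini-2.5-pro",
        "PROJECT_ROOT": "${workspaceFolder}"
      }
    }
  }
}
```

Only GEMINI_API_KEY is required; the other entries show where the optional variables would go.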

Tools

Functions exposed to the LLM to take actions

checkpoint

Save current context to Redis before running /clear. Merges new context with existing project state using LLM-based summarization.

resume

Load the last checkpoint at session start. Returns formatted context to inject into the conversation.

rollback

Revert to a previous checkpoint version. Useful if a merge produced incorrect results.

status

Show current state metadata including version, active files, tasks, token usage, and checkpoint history.
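Clients invoke these tools over MCP's standard JSON-RPC transport with a tools/call request. A sketch of invoking checkpoint (the empty arguments object is an assumption; consult the server's tool schema for any accepted parameters):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "checkpoint",
    "arguments": {}
  }
}
```

In practice the MCP client (e.g., Claude Desktop or Claude Code) constructs this request for you when the model decides to call the tool.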

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers through our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/coderdeep11/claude-memory-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.