Glama

Server Configuration

Describes the environment variables used to configure the server. Only PROJECT is required; all other variables are optional.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| WARMUP | No | Pre-warm the index by opening key files | |
| PROJECT | Yes | Path to C++ project root | |
| TIMEOUT | No | LSP request timeout | 5.0 |
| LOG_LEVEL | No | Logging verbosity: debug, info, warning, error | |
| AI_ENABLED | No | Enable AI features | |
| INDEX_PATH | No | Custom clangd index location | |
| AI_PROVIDER | No | AI provider: gemini-2.5-flash or gemini-2.5-flash-lite | |
| WARMUP_LIMIT | No | Number of files to warm up | 10 |
| AI_CACHE_DAYS | No | Cache AI summaries for N days | 7 |
| AI_COST_LIMIT | No | Monthly cost limit in USD | 10.0 |
| INDEX_TIMEOUT | No | Timeout for index wait | 300 |
| WAIT_FOR_INDEX | No | Wait for clangd indexing to complete | |
| AI_CONTEXT_LEVEL | No | Code context depth: minimal, local, or full | minimal |
| AI_ANALYSIS_LEVEL | No | Default analysis depth: summary or detailed | summary |
| CALL_HIERARCHY_DEPTH | No | Maximum depth (1-10) | 3 |
| CLANGAROO_AI_API_KEY | No | Google AI API key (can also be set via --ai-api-key) | |
| CALL_HIERARCHY_MAX_CALLS | No | Total call limit | 100 |
| CALL_HIERARCHY_PER_LEVEL | No | Calls per depth level | 25 |
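As a minimal sketch, the variables above could be exported in a shell before launching the server. The project path and API key placeholder are illustrative assumptions; only the defaults shown in the table are documented values.

```shell
# Hypothetical environment setup for the server.
# PROJECT is the only required variable; the path here is an example.
export PROJECT="$HOME/src/my-cpp-project"   # C++ project root (required)
export LOG_LEVEL=info                       # one of: debug, info, warning, error
export TIMEOUT=5.0                          # LSP request timeout (documented default)
export WARMUP_LIMIT=10                      # files to pre-warm (documented default)
export CLANGAROO_AI_API_KEY="your-key-here" # or pass --ai-api-key instead
```

The exact launch command and accepted value formats (e.g. for boolean flags like AI_ENABLED) are not specified on this page, so check the project's own README before relying on them.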

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM for taking actions

No tools

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/jasondk/clangaroo'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.