Server Configuration

The environment variables used to configure the server. Only AUGMENT_API_TOKEN is required; all other variables are optional and have defaults.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `CE_METRICS` | No | Enable in-process metrics collection (Prometheus format) | `false` |
| `AUGMENT_API_URL` | No | Auggie API URL | `https://api.augmentcode.com` |
| `CE_HTTP_METRICS` | No | Expose GET /metrics when running with `--http` | `false` |
| `REACTIVE_ENABLED` | No | Enable reactive review features | `false` |
| `AUGMENT_API_TOKEN` | Yes | Auggie API token (obtained via `auggie login` or from the dashboard) | – |
| `CE_TSC_INCREMENTAL` | No | Enable incremental tsc runs for static analysis | `true` |
| `CE_INDEX_STATE_STORE` | No | Persist per-file index hashes to `.augment-index-state.json` | `false` |
| `CE_SEMGREP_MAX_FILES` | No | Max files per semgrep invocation before chunking | `100` |
| `CE_TSC_BUILDINFO_DIR` | No | Directory to store tsbuildinfo cache (defaults to OS temp) | – |
| `CE_HASH_NORMALIZE_EOL` | No | Normalize CRLF/LF when hashing (recommended with state store across Windows/Linux) | `false` |
| `REACTIVE_PARALLEL_EXEC` | No | Enable concurrent worker execution | `false` |
| `CE_HTTP_PLAN_TIMEOUT_MS` | No | HTTP POST `/api/v1/plan` request timeout in milliseconds | `360000` |
| `CE_AI_REQUEST_TIMEOUT_MS` | No | Default timeout for AI calls (searchAndAsk) in milliseconds | `120000` |
| `REACTIVE_ENABLE_BATCHING` | No | Enable request batching (Phase 3) | `false` |
| `REACTIVE_OPTIMIZE_WORKERS` | No | Enable CPU-aware worker optimization (Phase 4) | `false` |
| `CE_SKIP_UNCHANGED_INDEXING` | No | Skip re-indexing unchanged files (requires `CE_INDEX_STATE_STORE=true`) | `false` |
| `CE_SEARCH_AND_ASK_QUEUE_MAX` | No | Max queued searchAndAsk requests before rejecting (0 = unlimited) | `50` |
| `CONTEXT_ENGINE_OFFLINE_ONLY` | No | Enforce offline-only policy; when enabled, the server fails to start if a remote API URL is configured | `false` |
| `CE_PLAN_AI_REQUEST_TIMEOUT_MS` | No | Timeout for planning AI calls in milliseconds (create_plan, refine_plan, step execution) | `300000` |
| `REACTIVE_USE_AI_AGENT_EXECUTOR` | No | Use local AI agent for reviews (Phase 1) | `false` |
| `REACTIVE_ENABLE_MULTILAYER_CACHE` | No | Enable 3-layer caching (Phase 2) | `false` |
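A minimal setup only needs the required token; everything else has a sensible default. The sketch below shows a typical environment for a POSIX shell, combining the incremental-indexing options that depend on each other (the placeholder token value is illustrative):

```shell
# Required: Auggie API token (from 'auggie login' or the dashboard).
export AUGMENT_API_TOKEN="<your-auggie-token>"

# Optional: persist per-file index hashes so unchanged files can be skipped.
export CE_INDEX_STATE_STORE=true
export CE_SKIP_UNCHANGED_INDEXING=true  # only effective with CE_INDEX_STATE_STORE=true
export CE_HASH_NORMALIZE_EOL=true       # recommended if the state store is shared across Windows/Linux

echo "configured: ${AUGMENT_API_TOKEN:+token-set}"   # prints: configured: token-set
```

Note that `CE_SKIP_UNCHANGED_INDEXING` is a no-op unless `CE_INDEX_STATE_STORE` is also enabled, since the skip decision is based on the persisted hashes.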

Tools

Functions exposed to the LLM so it can take actions.

No tools

Prompts

Interactive templates invoked by user choice.

No prompts

Resources

Contextual data attached and managed by the client.

No resources

MCP directory API

All information about MCP servers is available via our MCP directory API. For example, to fetch this server's entry:

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/Kirachon/context-engine'
```
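The endpoint returns JSON. A small wrapper that also captures the HTTP status code makes the call easier to script (the output path and variable name here are illustrative, not part of the API):

```shell
# Fetch this server's directory entry, saving the body and recording the HTTP status.
status=$(curl -s -o /tmp/context-engine.json -w '%{http_code}' \
  'https://glama.ai/api/mcp/v1/servers/Kirachon/context-engine')
echo "HTTP $status"
```

With `-w '%{http_code}'`, curl prints the status even on failure (`000` if the request never completed), so the variable is always safe to test.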

If you have feedback or need assistance with the MCP directory API, please join our Discord server.