
Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| GROQ_API_KEY | No | Groq API key for cloud enrichment backend | |
| BRAINLAYER_DB | No | Database file path | ~/.local/share/brainlayer/brainlayer.db |
| BRAINLAYER_MLX_URL | No | MLX server endpoint | http://127.0.0.1:8080/v1/chat/completions |
| BRAINLAYER_GROQ_URL | No | Groq API endpoint | https://api.groq.com/openai/v1/chat/completions |
| BRAINLAYER_MLX_MODEL | No | MLX model identifier | mlx-community/Qwen2.5-Coder-14B-Instruct-4bit |
| BRAINLAYER_GROQ_MODEL | No | Groq model for enrichment | llama-3.3-70b-versatile |
| BRAINLAYER_OLLAMA_URL | No | Ollama API endpoint | http://127.0.0.1:11434/api/generate |
| BRAINLAYER_ENRICH_MODEL | No | Ollama model name | glm-4.7-flash |
| BRAINLAYER_STALL_TIMEOUT | No | Seconds before killing a stuck enrichment chunk | 300 |
| BRAINLAYER_ENRICH_BACKEND | No | Enrichment LLM backend (mlx, ollama, or groq) | auto-detection (MLX → Ollama → Groq) |
| BRAINLAYER_HEARTBEAT_INTERVAL | No | Log progress every N chunks during enrichment | 25 |
| BRAINLAYER_SANITIZE_USE_SPACY | No | Use spaCy NER for PII detection | true |
| BRAINLAYER_SANITIZE_EXTRA_NAMES | No | Comma-separated names to redact from indexed content | |

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM to take actions

NameDescription

No tools

Prompts

Interactive templates invoked by user choice

NameDescription

No prompts

Resources

Contextual data attached and managed by the client

NameDescription

No resources

MCP directory API

We provide all the information about MCP servers in the directory via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/EtanHey/brainlayer'
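The same request can be made programmatically. A minimal Python sketch, assuming only the endpoint path shown in the curl example above; the shape of the JSON response is not documented here, so treat the decoded result as an opaque dict.

```python
import json
import urllib.request

def server_url(author: str, name: str) -> str:
    """Build the directory-API URL for a given server, mirroring the
    curl example above (author and name form the server slug)."""
    return f"https://glama.ai/api/mcp/v1/servers/{author}/{name}"

def fetch_server(author: str, name: str) -> dict:
    """GET the directory entry and decode it as JSON.
    Requires network access; response schema is an assumption."""
    with urllib.request.urlopen(server_url(author, name)) as resp:
        return json.load(resp)

print(server_url("EtanHey", "brainlayer"))
```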

If you have feedback or need assistance with the MCP directory API, please join our Discord server.