
Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| EMBED_DIM | No | Embedding vector dimension | 1536 |
| LOG_LEVEL | No | DEBUG / INFO / WARNING | INFO |
| CHAT_MODEL | No | LLM for categorization + primer synthesis | gpt-4o-mini |
| GITHUB_PAT | No | GitHub Personal Access Token with repo scope | (none) |
| DB_PASSWORD | No | PostgreSQL password (same as in DATABASE_URL) | (none) |
| PG_POOL_MAX | No | asyncpg max pool connections | 10 |
| PG_POOL_MIN | No | asyncpg min pool connections | 1 |
| DATABASE_URL | Yes | PostgreSQL connection string | (none) |
| MCP_TRANSPORT | No | FastMCP transport mode | streamable-http |
| OPENAI_API_KEY | Yes | OpenAI API key (for embeddings + LLM) | (none) |
| EMBEDDING_MODEL | No | OpenAI embedding model | text-embedding-3-small |
| OPENAI_TIMEOUT_S | No | Per-request OpenAI timeout (seconds) | 60 |
| DEFAULT_LIST_LIMIT | No | Default result count for list_categories | 50 |
| GITHUB_BACKUP_REPO | No | Target repo in owner/repo format (e.g. isaacriehm/memory-backup) | (none) |
| OPENAI_MAX_RETRIES | No | Exponential-backoff retry limit | 5 |
| DEFAULT_SEARCH_LIMIT | No | Default result count for search_memory | 10 |
| FASTMCP_JSON_RESPONSE | No | Set to 1 to force JSON responses | (none) |
| BACKUP_INTERVAL_SECONDS | No | Seconds between backups (21600 = 6 hours) | 21600 |
| PRIMER_UPDATE_MAX_AGE_S | No | Max seconds before auto primer regeneration | 3600 |
| MAX_CONCURRENT_API_CALLS | No | Semaphore for parallel OpenAI calls | 5 |
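The defaults in the table above can be mirrored in a small config loader. A minimal sketch covering a subset of the variables; the `ServerConfig` dataclass and `load_config` helper are illustrative, not part of the server's actual code:

```python
import os
from dataclasses import dataclass


@dataclass
class ServerConfig:
    # Required settings: fail fast if missing.
    database_url: str
    openai_api_key: str
    # Optional settings with the documented defaults.
    embed_dim: int = 1536
    log_level: str = "INFO"
    chat_model: str = "gpt-4o-mini"
    embedding_model: str = "text-embedding-3-small"
    mcp_transport: str = "streamable-http"
    openai_timeout_s: int = 60
    openai_max_retries: int = 5
    backup_interval_seconds: int = 21600


def load_config(env=os.environ) -> ServerConfig:
    """Build a ServerConfig from environment variables."""
    missing = [k for k in ("DATABASE_URL", "OPENAI_API_KEY") if k not in env]
    if missing:
        raise RuntimeError(f"missing required env vars: {missing}")
    return ServerConfig(
        database_url=env["DATABASE_URL"],
        openai_api_key=env["OPENAI_API_KEY"],
        embed_dim=int(env.get("EMBED_DIM", "1536")),
        log_level=env.get("LOG_LEVEL", "INFO"),
        chat_model=env.get("CHAT_MODEL", "gpt-4o-mini"),
        embedding_model=env.get("EMBEDDING_MODEL", "text-embedding-3-small"),
        mcp_transport=env.get("MCP_TRANSPORT", "streamable-http"),
        openai_timeout_s=int(env.get("OPENAI_TIMEOUT_S", "60")),
        openai_max_retries=int(env.get("OPENAI_MAX_RETRIES", "5")),
        backup_interval_seconds=int(env.get("BACKUP_INTERVAL_SECONDS", "21600")),
    )
```

Validating the two required variables up front, rather than letting the server fail later on a missing connection string or API key, keeps startup errors easy to diagnose.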

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions the server exposes for the LLM to take actions

NameDescription

No tools

Prompts

Interactive templates invoked by user choice

NameDescription

No prompts

Resources

Contextual data attached and managed by the client

NameDescription

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/isaacriehm/memory-mcp'
```
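The same endpoint can be queried from Python using only the standard library. A minimal sketch; the helper names are illustrative, and the response schema is whatever the directory API returns:

```python
import json
from urllib.request import urlopen

API_BASE = "https://glama.ai/api/mcp/v1/servers"


def server_info_url(owner: str, server: str) -> str:
    """Build the MCP directory API URL for a given server."""
    return f"{API_BASE}/{owner}/{server}"


def fetch_server_info(owner: str, server: str) -> dict:
    """Fetch and decode a server's directory entry as JSON."""
    with urlopen(server_info_url(owner, server)) as resp:
        return json.load(resp)


# Example: fetch_server_info("isaacriehm", "memory-mcp")
```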

If you have feedback or need assistance with the MCP directory API, please join our Discord server.