
M.I.M.I.R - Multi-agent Intelligent Memory & Insight Repository

by orneryd

Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| HTTP_PROXY | No | Corporate HTTP proxy URL (if needed) | |
| HTTPS_PROXY | No | Corporate HTTPS proxy URL (if needed) | |
| MIMIR_LLM_API | No | Base URL for the LLM API endpoint | http://copilot-api:4141 |
| SSL_CERT_FILE | No | Custom CA certificate file path (if needed) | |
| MIMIR_PM_MODEL | No | Optional per-agent model override for the PM agent (defaults to MIMIR_DEFAULT_MODEL) | |
| MIMIR_QC_MODEL | No | Optional per-agent model override for the QC agent (defaults to MIMIR_DEFAULT_MODEL) | |
| NEO4J_PASSWORD | No | Neo4j database password (change in production!) | password |
| MIMIR_LLM_API_KEY | No | Optional API key for the LLM provider (required for the OpenAI API) | dummy-key |
| MIMIR_LLM_API_PATH | No | Optional path for the LLM API chat completions endpoint | /v1/chat/completions |
| MIMIR_WORKER_MODEL | No | Optional per-agent model override for the Worker agent (defaults to MIMIR_DEFAULT_MODEL) | |
| HOST_WORKSPACE_ROOT | No | Your main source code directory. This gives Mimir access to your code for file indexing. Tilde (~) automatically expands to your home directory. | ~/src |
| MIMIR_DEFAULT_MODEL | No | Default LLM model to use | gpt-4.1 |
| MIMIR_EMBEDDINGS_API | No | Embeddings API endpoint | http://llama-server:8080 |
| MIMIR_AUTO_INDEX_DOCS | No | Auto-index Mimir documentation on startup, so users can immediately query Mimir's docs via semantic search | true |
| MIMIR_DEFAULT_PROVIDER | No | LLM provider selection. Options: openai, copilot, ollama, llama.cpp | openai |
| MIMIR_EMBEDDINGS_MODEL | No | Embedding model for semantic search. Options: bge-m3, nomic-embed-text, text-embedding-3-small | bge-m3 |
| MIMIR_EMBEDDINGS_ENABLED | No | Enable vector embeddings for AI semantic search | true |
| MIMIR_EMBEDDINGS_API_PATH | No | Optional path for the embeddings API endpoint | /v1/embeddings |
| MIMIR_LLM_API_MODELS_PATH | No | Optional path for the LLM API models endpoint | /v1/models |
| MIMIR_EMBEDDINGS_CHUNK_SIZE | No | Chunk size for embeddings | 768 |
| MIMIR_EMBEDDINGS_DIMENSIONS | No | Embedding vector dimensions | 1024 |
| MIMIR_FEATURE_VECTOR_EMBEDDINGS | No | Enable the vector embeddings feature flag | true |
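As a starting point, the variables above can be set in the shell (or an `.env` file) before launching the server. The values below are illustrative overrides, not required settings; defaults are listed in the table:

```shell
# Illustrative environment overrides for running Mimir.
# All variables are optional; unset values fall back to the table's defaults.
export MIMIR_DEFAULT_PROVIDER=openai          # or: copilot, ollama, llama.cpp
export MIMIR_LLM_API=http://copilot-api:4141  # base URL of the LLM API
export MIMIR_LLM_API_KEY=dummy-key            # set a real key when using the OpenAI API
export NEO4J_PASSWORD=change-me               # never ship the default "password"
export HOST_WORKSPACE_ROOT=~/src              # tilde expands to your home directory
export MIMIR_EMBEDDINGS_ENABLED=true
```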

Tools

Functions exposed to the LLM to take actions


No tools

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources


MCP directory API

We provide all the information about MCP servers via our MCP API:

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/orneryd/Mimir'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.