M.I.M.I.R - Multi-agent Intelligent Memory & Insight Repository

by orneryd
env.ollama

# Ollama Configuration Example
# Copy to project root as .env to use: cp docs/examples/env.ollama .env

# LLM API (Ollama - OpenAI-compatible chat endpoint)
MIMIR_LLM_API=http://ollama:11434
MIMIR_LLM_API_PATH=/v1/chat/completions
MIMIR_LLM_API_MODELS_PATH=/v1/models
MIMIR_LLM_API_KEY=dummy-key

# Embeddings API (Ollama - Native API format)
MIMIR_EMBEDDINGS_API=http://ollama:11434
MIMIR_EMBEDDINGS_API_PATH=/api/embeddings
MIMIR_EMBEDDINGS_API_MODELS_PATH=/api/tags
MIMIR_EMBEDDINGS_API_KEY=dummy-key

# Provider Configuration
MIMIR_DEFAULT_PROVIDER=ollama
MIMIR_DEFAULT_MODEL=qwen2.5-coder:14b

# Embeddings Configuration
MIMIR_EMBEDDINGS_ENABLED=true
MIMIR_EMBEDDINGS_PROVIDER=ollama
MIMIR_EMBEDDINGS_MODEL=mxbai-embed-large
MIMIR_EMBEDDINGS_DIMENSIONS=1024
MIMIR_EMBEDDINGS_CHUNK_SIZE=768
MIMIR_EMBEDDINGS_CHUNK_OVERLAP=10
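Because the file is plain KEY=VALUE pairs, a quick way to sanity-check it before starting the stack is to source it into a shell and inspect the variables. A minimal sketch, assuming a POSIX shell; the two lines written inline here are representative entries copied from the example above (in practice you would run `cp docs/examples/env.ollama .env` and source the full file):

```shell
# Write two representative entries from env.ollama into .env
# (stand-in for the full `cp docs/examples/env.ollama .env` step).
cat > .env <<'EOF'
MIMIR_LLM_API=http://ollama:11434
MIMIR_DEFAULT_MODEL=qwen2.5-coder:14b
EOF

set -a         # export every variable assigned while sourcing
. ./.env
set +a

echo "LLM endpoint:  $MIMIR_LLM_API"
echo "Default model: $MIMIR_DEFAULT_MODEL"
```

`set -a` (allexport) makes each assignment in the sourced file an exported environment variable, so a subsequently launched process such as a Docker Compose stack can see them without any extra plumbing.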
