
M.I.M.I.R - Multi-agent Intelligent Memory & Insight Repository

by orneryd
Dockerfile
FROM ollama/ollama:latest

# Set the embedding model to pull (can be overridden at build time)
ARG EMBEDDING_MODEL=nomic-embed-text

# Start Ollama server in the background and pull the model
RUN ollama serve & \
    sleep 5 && \
    ollama pull ${EMBEDDING_MODEL} && \
    pkill ollama
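
A minimal build sketch, assuming the snippet above is saved as Dockerfile in the current directory; the image tag mimir-embeddings and the alternate model name are illustrative, not part of the project:

# Build with the default embedding model
docker build -t mimir-embeddings .

# Override the embedding model at build time
docker build --build-arg EMBEDDING_MODEL=mxbai-embed-large -t mimir-embeddings .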


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/orneryd/Mimir'
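
To inspect the JSON response more easily, the same request can be piped through jq (assuming jq is installed; the response structure is not shown here):

curl -s 'https://glama.ai/api/mcp/v1/servers/orneryd/Mimir' | jq '.'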

If you have feedback or need assistance with the MCP directory API, please join our Discord server.