M.I.M.I.R - Multi-agent Intelligent Memory & Insight Repository

by orneryd
.dockerignore (751 B)

node_modules
build
.git
coverage
testing
.mcp-memory-store.json
.env

# Build artifacts (frontend/dist is pre-built locally to avoid Alpine ARM64 rollup bug)
# frontend/dist - KEEP THIS, we copy pre-built frontend
frontend/node_modules
vscode-extension/dist
vscode-extension/node_modules

# Ignore markdown except README.md
*.md
!README.md

# Development and runtime data
data
logs
copilot-data
ollama_models

# IDE and editor files
.vscode
.idea
*.swp
*.swo
*~

# OS files
.DS_Store
Thumbs.db

# Generated agents
generated-agents

# Test files
*.test.ts
*.spec.ts
__tests__

# Docker files (don't need in context)
docker-compose*.yml
Dockerfile*
.dockerignore

# Scripts not needed in container
scripts
deliverables
docker/llama-cpp/models/*
nornicdb

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/orneryd/Mimir'
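
The same endpoint can be queried programmatically. Below is a minimal TypeScript sketch equivalent to the curl command above, assuming a Node 18+ runtime with the built-in fetch API; the shape of the JSON response is not documented here, so it is left untyped.

```typescript
// Minimal sketch: fetch this server's entry from the Glama MCP directory API.
// The response is treated as unknown because its schema is an assumption here.
const SERVER_URL = "https://glama.ai/api/mcp/v1/servers/orneryd/Mimir";

async function fetchServerInfo(): Promise<unknown> {
  const res = await fetch(SERVER_URL);
  if (!res.ok) {
    throw new Error(`Glama API request failed: ${res.status} ${res.statusText}`);
  }
  return res.json();
}

// Print the server metadata as formatted JSON.
fetchServerInfo()
  .then((info) => console.log(JSON.stringify(info, null, 2)))
  .catch((err) => console.error(err));
```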

If you have feedback or need assistance with the MCP directory API, please join our Discord server.