memorix_store

Store and index observations like code decisions, bug fixes, and insights across projects. Classify by type, track progress, and persist memories locally to share knowledge across multiple IDEs.

Instructions

Store a new observation/memory. Observations are automatically indexed for search. Use type to classify:

- gotcha (πŸ”΄ critical pitfall)
- decision (🟀 architecture choice)
- problem-solution (🟑 bug fix)
- how-it-works (πŸ”΅ explanation)
- what-changed (🟒 change)
- discovery (🟣 insight)
- why-it-exists (🟠 rationale)
- trade-off (βš–οΈ compromise)
- session-request (🎯 original goal)

Stored memories persist across sessions and are shared with other IDEs (Cursor, Windsurf, Claude Code, Codex, Copilot, Kiro, Antigravity, Trae) via the same local data directory.

Input Schema

| Name | Required | Description |
| --- | --- | --- |
| entityName | Yes | The entity this observation belongs to (e.g., "auth-module", "port-config") |
| type | Yes | Observation type for classification |
| title | Yes | Short descriptive title (~5-10 words) |
| narrative | Yes | Full description of the observation |
| facts | No | Structured facts (e.g., "Default timeout: 60s") |
| filesModified | No | Files involved |
| concepts | No | Related concepts/keywords |
| topicKey | No | Optional topic identifier for upserts (e.g., "architecture/auth-model"). If an observation with the same topicKey already exists in this project, it will be UPDATED instead of creating a new one. Use memorix_suggest_topic_key to generate a stable key. Good for evolving decisions, architecture docs, etc. |
| progress | No | Progress tracking for task/feature observations |
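To make the schema concrete, here is a minimal sketch of a memorix_store input payload built in Python. The field names and example values follow the table above; how the payload is actually sent (the MCP client wrapper, transport, and response handling) depends on your setup and is not shown.

```python
# A sketch of a memorix_store input payload. Field names come from the
# input schema above; the narrative/facts values are illustrative only.
observation = {
    "entityName": "auth-module",           # required: owning entity
    "type": "decision",                    # required: one of the listed types
    "title": "Use JWT for session auth",   # required: ~5-10 word title
    "narrative": (
        "Chose stateless JWTs over server-side sessions to avoid "
        "shared session storage across instances."
    ),
    "facts": ["Default token TTL: 60s"],   # optional structured facts
    "filesModified": ["src/auth/jwt.ts"],  # optional: files involved
    "concepts": ["jwt", "sessions"],       # optional keywords
    "topicKey": "architecture/auth-model", # optional: enables upsert
}

# Sanity-check that all required fields from the schema are present.
required = {"entityName", "type", "title", "narrative"}
missing = required - observation.keys()
assert not missing, f"missing required fields: {missing}"
```

Because topicKey is set, a later call with the same key would update this observation in place rather than create a duplicate.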

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/AVIDS2/memorix'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.