engram-mcp provides persistent semantic memory for AI agents, storing and retrieving memories across sessions using SQLite and local Ollama embeddings (nomic-embed-text) — no API keys or cloud services required.
### Store memories

- `remember`: Save session memory entries with automatic embedding and graph extraction
- `remember_user`: Store user-scoped memories (facts, preferences) that persist across all sessions
### Retrieve memories

- `recall`: Semantic search across session memories (falls back to keyword search); can blend in cross-session user facts via `userId`
- `recall_user`: Search or retrieve all user-scoped memories from any session context
- `history`: Get recent conversation history for a session in chronological order
- `graph`: Query extracted entity relationships and source memories (requires `ENGRAM_GRAPH=1`)
### Manage memories

- `forget`: Delete session memories — all, by ID, or before a date
- `forget_user`: Delete user-scoped memories by ID, date, or all at once
- `consolidate`: Compress old working session memories into dense long-term summaries via LLM, archiving the originals (supports dry-run preview)
- `consolidate_user`: Consolidate user-scoped working memories into long-term summaries
### Monitor memories

- `stats`: Session memory counts broken down by role, tier (working/long_term/archived), and graph data
- `user_stats`: Memory statistics for a specific user across all sessions
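The consolidation tools above follow the working → long_term → archived tier model from `stats`. A minimal sketch of that policy in TypeScript, with a hypothetical `Memory` shape and a string join standing in for the LLM summary (illustrative only, not the package's internals):

```typescript
// Hypothetical memory record; field names are illustrative, not engram's schema.
type Tier = "working" | "long_term" | "archived";
interface Memory { id: number; content: string; tier: Tier; createdAt: number }

// Select working memories older than `cutoffMs`, emit one dense summary,
// and mark the originals archived; dryRun previews without mutating anything.
function consolidate(memories: Memory[], now: number, cutoffMs: number, dryRun = false) {
  const old = memories.filter(m => m.tier === "working" && now - m.createdAt > cutoffMs);
  const summary = old.map(m => m.content).join("; "); // stand-in for an LLM summary
  if (!dryRun) for (const m of old) m.tier = "archived";
  return { summarized: old.length, summary };
}
```

The dry-run flag mirrors the preview behavior described for `consolidate` above.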
# @cartisien/engram-mcp

Persistent semantic memory for AI agents — MCP server powered by `@cartisien/engram`
Give any MCP-compatible AI client (Claude Desktop, Cursor, Windsurf) persistent memory that survives across sessions.
```shell
npx -y @cartisien/engram-mcp
```

## What it does
Exposes five core tools to any MCP client:
| Tool | Description |
|------|-------------|
| `remember` | Store a memory with automatic embedding |
| `recall` | Semantic search across stored memories |
| `history` | Recent conversation history |
| `forget` | Delete one memory, a session, or entries before a date |
| `stats` | Memory statistics for a session |
Memories are stored in SQLite. Semantic search uses local Ollama embeddings (nomic-embed-text) — no API key, no cloud. Falls back to keyword search if Ollama isn't available.
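The two ranking modes can be sketched as follows: cosine similarity over embedding vectors when Ollama is available, and a term-overlap score as the keyword fallback. Function names here are illustrative; the actual scoring lives inside `@cartisien/engram`:

```typescript
// Cosine similarity between two embedding vectors (e.g. from nomic-embed-text).
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) { dot += a[i] * b[i]; na += a[i] ** 2; nb += b[i] ** 2; }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Keyword fallback: fraction of query terms found in the stored memory text.
function keywordScore(query: string, text: string): number {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  const hay = text.toLowerCase();
  return terms.filter(t => hay.includes(t)).length / Math.max(terms.length, 1);
}
```

Both scores land in [0, 1] (cosine does for the non-negative embeddings typical of text models), so `recall` results can carry a comparable `similarity` field in either mode.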
Quick Start
Claude Desktop
Add to `~/Library/Application Support/Claude/claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "engram": {
      "command": "npx",
      "args": ["-y", "@cartisien/engram-mcp"],
      "env": {
        "ENGRAM_DB": "~/.engram/memory.db"
      }
    }
  }
}
```

Restart Claude Desktop. You'll see `remember`, `recall`, `history`, `forget`, and `stats` available as tools.
Cursor / Windsurf
Add to your MCP config:

```json
{
  "mcpServers": {
    "engram": {
      "command": "npx",
      "args": ["-y", "@cartisien/engram-mcp"]
    }
  }
}
```

## Configuration
| Env Var | Default | Description |
|---------|---------|-------------|
| `ENGRAM_DB` | | SQLite database path |
| | | Ollama base URL for embeddings |
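For a quick launch outside an MCP client config, the same setting can be passed as an environment variable. A sketch, assuming a POSIX shell; `ENGRAM_DB` is the only variable named in this README:

```shell
# Point the server at a persistent database file, then launch it over stdio.
export ENGRAM_DB="$HOME/.engram/memory.db"
npx -y @cartisien/engram-mcp
```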
### Local Embeddings (Recommended)

Install Ollama and pull the embedding model:

```shell
ollama pull nomic-embed-text
```

Semantic search activates automatically. Without Ollama, keyword search is used.
## Example Usage

Once connected, your agent can:

```
remember(sessionId="myagent", content="User prefers TypeScript over JavaScript", role="user")

recall(sessionId="myagent", query="what are the user's coding preferences?", limit=5)
# Returns: [{ content: "User prefers TypeScript...", similarity: 0.82 }, ...]

history(sessionId="myagent", limit=10)

stats(sessionId="myagent")
# { total: 42, byRole: { user: 20, assistant: 22 }, withEmbeddings: 42 }
```

## Part of the Cartisien Memory Suite
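Under the hood, an MCP client issues each of these calls as a JSON-RPC 2.0 `tools/call` request over the server's stdio transport. A sketch of the wire format per the Model Context Protocol spec, reusing the `remember` arguments from the example above:

```typescript
// Build an MCP tools/call request (JSON-RPC 2.0, as defined by the MCP spec).
function toolCall(id: number, name: string, args: Record<string, unknown>) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

const req = toolCall(1, "remember", {
  sessionId: "myagent",
  content: "User prefers TypeScript over JavaScript",
  role: "user",
});

// Over stdio, each message is one newline-delimited JSON line on the server's stdin.
const line = JSON.stringify(req) + "\n";
```

Your MCP client builds these messages for you; this is only what crosses the pipe when the agent invokes a tool.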
- `@cartisien/engram` — core memory SDK
- `@cartisien/engram-mcp` — this package, the MCP server
- `@cartisien/extensa` — vector infrastructure (coming soon)
- `@cartisien/cogito` — agent identity & lifecycle (coming soon)
MIT © Cartisien Interactive