jamjet-labs/engram-mcp-server
Durable memory for AI agents — temporal knowledge graph, hybrid retrieval, SQLite or PostgreSQL.
java-ai-memory.dev · Source code · JamJet docs · Discord
Engram is a durable memory layer for AI agents. It extracts facts from conversations, stores them in a temporal knowledge graph, and retrieves them with hybrid semantic + keyword search — backed by a single SQLite file or a PostgreSQL database.
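The temporal part of the knowledge graph can be pictured with a small sketch (field names here are illustrative, not Engram's actual schema): each fact carries a validity interval, so an update closes the old record instead of overwriting it, and history stays queryable.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Fact:
    """Illustrative temporal fact: superseding closes the old record."""
    subject: str
    predicate: str
    obj: str
    valid_from: datetime
    valid_to: Optional[datetime] = None  # None = still current

    @property
    def is_current(self) -> bool:
        return self.valid_to is None

def supersede(old: Fact, new_obj: str) -> Fact:
    """Close the old fact and open a new one, keeping history intact."""
    now = datetime.now(timezone.utc)
    old.valid_to = now
    return Fact(old.subject, old.predicate, new_obj, valid_from=now)

# A user moves: the old fact stays queryable, just no longer current.
home = Fact("alice", "lives_in", "Berlin", datetime(2023, 1, 1, tzinfo=timezone.utc))
updated = supersede(home, "Lisbon")
print(home.is_current, updated.is_current)  # False True
```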
This repo hosts the Glama registry listing. Source code lives in the main JamJet repo.
Quickstart — 30 seconds
```bash
# Docker — uses local Ollama by default
docker run --rm -i \
  -v engram-data:/data \
  ghcr.io/jamjet-labs/engram-server:0.5.0
```

Or install from crates.io:

```bash
cargo install jamjet-engram-server
engram serve
```

Claude Desktop configuration
Add to ~/Library/Application Support/Claude/claude_desktop_config.json:
```json
{
  "mcpServers": {
    "engram": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-v", "engram-data:/data",
        "ghcr.io/jamjet-labs/engram-server:0.5.0"
      ]
    }
  }
}
```

After a restart, 11 MCP tools are available to the model.
MCP Tools (11)
Memory tools (7)
| Tool | Description |
| --- | --- |
| | Extract and store facts from conversation messages using LLM-powered fact extraction. Side effects: calls the configured LLM to parse facts, then writes them to the knowledge graph. Returns extracted fact IDs. Requires … |
| `memory_recall` | Semantic search over stored facts using vector similarity. Read-only, no side effects. Returns ranked facts matching the query, scoped by (org_id, user_id, session_id). |
| | Assemble a token-budgeted context block for LLM prompts with tier-aware fact selection. Read-only. Returns a formatted string of the most relevant facts, capped at the specified token budget. Use this instead of `memory_recall` when you need a ready-to-use prompt snippet. |
| | Keyword search over facts using full-text search (SQLite FTS5 / Postgres). Read-only, no side effects. Returns facts matching exact keywords. Use this when you need precise term matching rather than semantic similarity from `memory_recall`. |
| `memory_forget` | Soft-delete a fact by ID with an optional reason. Side effect: marks the fact as deleted in the knowledge graph (does not physically remove it). Irreversible via this tool. Use when a user asks to remove specific information. |
| | Get aggregate statistics: total facts, valid (non-deleted) facts, entity count, and relationship count. Read-only, no side effects. Use this to understand the size and health of the memory store. |
| | Run a maintenance cycle over the knowledge graph — decay stale facts, promote high-confidence ones, deduplicate near-duplicates, and summarize clusters. Side effects: modifies fact scores and may merge or archive facts. Run periodically to keep memory accurate. |
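The maintenance cycle in the last row can be sketched as a toy pass. The thresholds, decay factor, and exact-text dedup below are assumptions for illustration, not Engram's actual algorithm:

```python
from dataclasses import dataclass

@dataclass
class ScoredFact:
    text: str
    confidence: float  # 0..1; decays when stale, promoted when reinforced

DECAY = 0.9          # assumed per-cycle decay applied to every fact
PROMOTE_AT = 0.8     # assumed promotion threshold
ARCHIVE_AT = 0.2     # assumed archive threshold

def consolidate(facts: list[ScoredFact]) -> dict[str, list[ScoredFact]]:
    """One toy maintenance cycle: decay, dedup by text, then partition."""
    # Dedup near-duplicates (here: exact text match keeps the max confidence).
    best: dict[str, ScoredFact] = {}
    for f in facts:
        f.confidence *= DECAY
        if f.text not in best or f.confidence > best[f.text].confidence:
            best[f.text] = f
    kept, promoted, archived = [], [], []
    for f in best.values():
        if f.confidence >= PROMOTE_AT:
            promoted.append(f)
        elif f.confidence < ARCHIVE_AT:
            archived.append(f)
        else:
            kept.append(f)
    return {"promoted": promoted, "kept": kept, "archived": archived}

result = consolidate([
    ScoredFact("alice lives in Lisbon", 0.95),
    ScoredFact("alice lives in Lisbon", 0.60),  # duplicate, lower score
    ScoredFact("alice likes tea", 0.10),        # stale, drops below archive
])
```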
Message store tools (4)
| Tool | Description |
| --- | --- |
| | Save chat messages for a conversation by ID. Side effects: writes messages to the store and optionally triggers fact extraction (controlled by …). |
| `messages_get` | Retrieve all messages for a conversation by ID. Read-only, no side effects. Returns the ordered message array. Use this to replay or inspect a past conversation. |
| | List all conversation IDs in the message store. Read-only, no side effects. Returns an array of conversation ID strings. Use this to discover what conversations are stored before retrieving with `messages_get`. |
| | Delete all messages for a conversation by ID. Side effect: permanently removes the conversation's messages from the store. Irreversible. Does not affect extracted facts — use `memory_forget` for that. |
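The four message-store tools map onto a simple keyed store. A minimal in-memory analogue of that surface (illustrative only, not Engram's storage layer):

```python
class MessageStore:
    """Toy in-memory analogue of the four message-store tools."""

    def __init__(self):
        self._store: dict[str, list[dict]] = {}

    def save(self, conversation_id: str, messages: list[dict]) -> None:
        """Append messages to a conversation, creating it if needed."""
        self._store.setdefault(conversation_id, []).extend(messages)

    def get(self, conversation_id: str) -> list[dict]:
        """Return the ordered message list, or [] if unknown."""
        return self._store.get(conversation_id, [])

    def list_conversations(self) -> list[str]:
        """Return every stored conversation ID."""
        return list(self._store)

    def delete(self, conversation_id: str) -> None:
        # Irreversible here too, and independent of any extracted facts.
        self._store.pop(conversation_id, None)

store = MessageStore()
store.save("conv-1", [{"role": "user", "content": "hi"}])
store.save("conv-1", [{"role": "assistant", "content": "hello!"}])
```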
All memory tools are scoped by (org_id, user_id, session_id) — org is the coarsest, session the finest.
LLM Providers
Provider-agnostic. One binary: set `ENGRAM_LLM_PROVIDER=...` and go:
| Provider | Env value | Notes |
| --- | --- | --- |
| Ollama | | Local, free, no API keys |
| OpenAI-compatible | `openai-compatible` | OpenAI, Azure, Groq, Together, Mistral, DeepSeek, vLLM, LM Studio, ... |
| Anthropic | | Claude via Messages API |
| Google | | Gemini via generateContent |
| Shell command | | Pipe to any external script |
| Mock | | Deterministic, for tests only |
```bash
# Example: use Groq instead of Ollama
docker run --rm -i \
  -e ENGRAM_LLM_PROVIDER=openai-compatible \
  -e ENGRAM_OPENAI_BASE_URL=https://api.groq.com/openai/v1 \
  -e OPENAI_API_KEY=gsk_... \
  -v engram-data:/data \
  ghcr.io/jamjet-labs/engram-server:0.5.0
```

Why Engram?
| Problem | Engram's answer |
| --- | --- |
| Every agent memory library is Python-first | Rust core with native Python, Java, and MCP clients |
| Needs Postgres + Qdrant + Neo4j just to try | Single SQLite file (zero infra) or Postgres when you need it |
| Conversation history is not knowledge memory | Fact extraction pipeline — structured facts from messages |
| Old facts drift and contradict | Conflict detection + consolidation — decay, promote, dedup, summarize |
| Memory recall is either semantic OR keyword | Hybrid retrieval — vector search + FTS5 in one query |
| MCP support is an afterthought | MCP-native — 11 tools exposed by a single binary |
| Can't isolate memory per user or tenant | First-class scopes — org / user / session built into every query |
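The hybrid-retrieval row can be illustrated with a toy score fusion. The blending weight and the keyword scorer below are stand-ins: Engram combines vector search with FTS5, but its actual fusion is not shown here.

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def keyword_score(text: str, query: str) -> float:
    """Toy stand-in for FTS5: fraction of query terms present in the text."""
    terms = query.lower().split()
    return sum(t in text.lower() for t in terms) / len(terms)

def hybrid_score(text: str, emb: list[float], query: str,
                 q_emb: list[float], alpha: float = 0.5) -> float:
    # alpha blends semantic similarity with exact keyword matching.
    return alpha * cosine(emb, q_emb) + (1 - alpha) * keyword_score(text, query)

docs = [
    ("alice lives in Lisbon", [1.0, 0.0]),
    ("bob prefers green tea", [0.0, 1.0]),
]
q, q_emb = "where does alice live", [0.9, 0.1]
ranked = sorted(docs, key=lambda d: hybrid_score(d[0], d[1], q, q_emb), reverse=True)
```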
Client SDKs
| Language | Package | Install |
| --- | --- | --- |
| Python | `jamjet` | |
| Java | | Maven Central |
| Spring Boot | `dev.jamjet:engram-spring-boot-starter` | Maven Central |
| Rust | `jamjet-engram` | |
Related
JamJet — the full agent-native runtime (parent project)
java-ai-memory.dev — comparison with Mem0, Zep, LangChain4j, Spring AI, and others
License
Apache 2.0 — see LICENSE.
MCP directory API
We provide all the information about MCP servers via our MCP API.
```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/jamjet-labs/engram-mcp-server'
```
If you have feedback or need assistance with the MCP directory API, please join our Discord server.