# Ember MCP
> Persistent tiered memory for LLM agents — powered by HESTIA scoring and Shadow-Decay.
Ember MCP is a Model Context Protocol (MCP) server that provides LLMs with long-term memory across sessions and clients. It runs entirely locally on your machine — no cloud, no API keys. Ember uses a 4-tier memory architecture (working → session → relational → glacier) with automatic promotion, TTL-based expiry, and the Shadow-Decay framework to prevent retrieval of stale, contradicted knowledge.
## Installation
```shell
pip install git+https://github.com/Arkya-AI/ember-mcp.git
```
Then add to your MCP client config (Claude Desktop / Claude Code):
```json
{
  "mcpServers": {
    "ember": {
      "command": "ember-mcp",
      "args": []
    }
  }
}
```
Optional: for vector/semantic search, also run `pip install "ember-mcp[semantic]"`.
## Key Features
- 4-tier memory: working, session, relational, glacier — with auto-promotion and TTL expiry
- Cross-session memory across all MCP clients
- 100% local — no API keys, no cloud, ~300MB disk (semantic search optional), ~150MB RAM
- HESTIA scoring: combines semantic similarity, shadow load, and topic vitality
- Shadow-Decay: newer memories shadow older similar ones, preventing stale retrieval
- FTS5 full-text search fallback when semantic search returns empty
- CORAL checkpointing: save/resume task state, detect cognitive overload
- Consolidation engine: auto-promotes frequently accessed memories
- Partial ID support: 8-char short IDs work for all mutation tools
- Knowledge graph with semantic edges and BFS traversal
- Source linking for traceable decisions
## How It Works
1. Storage: SQLite with FTS5 full-text search (`~/.ember-v3/ember.db`)
2. 4-tier TTL model: working (2h) → session (24h) → relational (30d) → glacier (permanent)
3. Optional semantic search via sentence-transformers (all-MiniLM-L6-v2, 384-dim, CPU)
4. Shadow-Decay: φ(mᵢ|mⱼ) assigns a shadow load to older, semantically similar memories
5. HESTIA scoring: S = cos_sim · (1-Φ)^γ · vitality_factor ranks retrieval results
6. CORAL checkpointing: task state snapshots with overload detection
7. Consolidation: frequently accessed session memories auto-promote to relational tier
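To make the ranking step concrete, here is a minimal sketch of HESTIA-style scoring based on the formula above (S = cos_sim · (1-Φ)^γ · vitality_factor). This is an illustration of the idea, not Ember's actual implementation; the function name, default γ, and value ranges are assumptions.

```python
# Illustrative sketch of HESTIA-style ranking (not Ember's actual code).
# Assumed inputs: cosine similarity in [0, 1], accumulated shadow load
# Phi in [0, 1), a decay exponent gamma, and a vitality factor >= 0.

def hestia_score(cos_sim: float, shadow_load: float,
                 vitality: float, gamma: float = 2.0) -> float:
    """S = cos_sim * (1 - Phi)^gamma * vitality_factor."""
    return cos_sim * (1.0 - shadow_load) ** gamma * vitality

# A heavily shadowed memory can rank below a fresher, less-similar one:
fresh = hestia_score(cos_sim=0.70, shadow_load=0.1, vitality=1.0)
stale = hestia_score(cos_sim=0.90, shadow_load=0.6, vitality=1.0)
```

The shadow term is what keeps contradicted knowledge out of results: even a high cosine similarity is suppressed once newer memories have piled shadow load onto an older one.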
## 19 MCP Tools
### Store & Recall
- `ember_store` — Save a named memory with importance level, tags, tier, and graph edges
- `ember_recall` — Semantic + FTS5 search ranked by HESTIA score
- `ember_deep_recall` — Recall + read source files for full context
- `ember_learn` — Auto-capture facts, preferences, decisions from conversation
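As a sketch of what an MCP client might send, the payloads below pair `ember_store` with `ember_recall`. The parameter names (`content`, `importance`, `tags`, `tier`, `query`) are assumptions inferred from the tool descriptions above, not a confirmed schema; consult the server's tool listing for the real one.

```python
# Hypothetical argument payloads for ember_store and ember_recall.
# Parameter names are illustrative guesses from the tool summaries,
# not Ember's verified schema.
store_args = {
    "content": "We chose SQLite over Postgres for zero-config local storage.",
    "importance": "high",  # assumed importance level
    "tags": ["architecture", "storage"],
    "tier": "relational",  # one of: working, session, relational, glacier
}

recall_args = {
    # results would be ranked by HESTIA score, with FTS5 fallback
    "query": "why did we pick SQLite?",
}
```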
### Management
- `ember_list` — List stored memories, filter by tag
- `ember_delete` — Remove a memory by ID (supports 8-char short IDs)
- `ember_contradict` — Mark outdated memory stale, store corrected version
- `ember_read` — Read full content of a specific memory by ID
- `ember_set_status` — Update task status (open/in_progress/done)
### Intelligence
- `ember_auto` — Auto-retrieve relevant context at conversation start
- `ember_inspect` — View Voronoi cell distribution and conflict density
- `ember_save_session` — Save session summary with decisions, next steps, source linking
- `ember_drift_check` — Analyze knowledge region health, flag drifting/silent regions
- `ember_actionable` — Return embers with active status (open/in_progress)
### Advanced Analysis
- `ember_graph_search` — Vector search + BFS traversal via knowledge graph edges
- `ember_health` — Hallucination risk score with trend tracking
- `ember_recompute_shadows` — Full recalculation of shadow loads after migration
- `ember_explain` — HESTIA score breakdown for a specific memory
- `ember_compact` — AI-powered compaction of stale/shadowed memories
## Storage
All data stored locally at `~/.ember-v3/ember.db` (single SQLite file).
## Requirements
- Python 3.11+
- macOS, Linux, Windows (WSL)
## Links
- GitHub: https://github.com/Arkya-AI/ember-mcp
- Website: https://ember.timolabs.dev
- License: MIT
- Author: Timo Labs (https://github.com/Arkya-AI)