Memora
A lightweight Model Context Protocol (MCP) server that persists shared memories in SQLite. Compatible with Claude Code, Codex CLI, and other MCP-aware clients.
Features
Persistent Storage - SQLite-backed database with optional cloud sync (S3, GCS, Azure)
Semantic Search - Vector embeddings (TF-IDF, sentence-transformers, or OpenAI)
Event Notifications - Poll-based system for inter-agent communication
Advanced Queries - Full-text search, date ranges, tag filters (AND/OR/NOT)
Cross-references - Auto-linked related memories based on similarity
Hierarchical Organization - Explore memories by section/subsection
Export/Import - Backup and restore with merge strategies
Knowledge Graph - Interactive HTML visualization with filtering
Live Graph Server - Auto-starts HTTP server for remote access via SSH
Statistics & Analytics - Tag usage, trends, and connection insights
Zero Dependencies - Works out of the box with the Python stdlib (optional backends available)
Install
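A minimal sketch, assuming Memora is published on PyPI under the package name `memora` (the package name is an assumption):

```bash
pip install memora
```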
Usage
The server runs automatically when configured in Claude Code. Manual invocation:
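A sketch of a manual run, assuming the package exposes a module entry point (the entry point name is an assumption):

```bash
# Speaks MCP over stdio; entry point name assumed
python -m memora
```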
Claude Code Config
Add to `.mcp.json` in your project root:
Local DB
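A sketch of a local configuration. The `.mcp.json` shape is Claude Code's standard MCP config; the `memora` command and the `MEMORA_DB_PATH` variable name are assumptions:

```json
{
  "mcpServers": {
    "memora": {
      "command": "memora",
      "args": [],
      "env": {
        "MEMORA_DB_PATH": "./memora.db"
      }
    }
  }
}
```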
Cloud DB (S3/R2)
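The same shape pointed at cloud storage; `MEMORA_DB_URI` is an assumed stand-in for the cloud storage URI variable, and the bucket path is an example:

```json
{
  "mcpServers": {
    "memora": {
      "command": "memora",
      "args": [],
      "env": {
        "MEMORA_DB_URI": "s3://my-bucket/memora.db",
        "AWS_PROFILE": "r2"
      }
    }
  }
}
```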
Codex CLI Config
Add to `~/.codex/config.toml`:
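A sketch using Codex CLI's `mcp_servers` TOML tables; the server name, command, and variable name are assumptions:

```toml
# Sketch; command and env var name are assumptions
[mcp_servers.memora]
command = "memora"
args = []

[mcp_servers.memora.env]
MEMORA_DB_PATH = "~/.memora/memora.db"
```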
Environment Variables
| Variable | Description |
|----------|-------------|
| | Local SQLite database path |
| | Cloud storage URI for S3/R2 |
| | Encrypt the database before uploading to cloud |
| | Compress the database before uploading to cloud |
| | Local cache directory for the cloud-synced database |
| | Allow any tag without validation against the allowlist |
| | Path to a file containing allowed tags (one per line) |
| | Comma-separated list of allowed tags |
| | Port for the knowledge graph visualization server |
| | Embedding backend: TF-IDF (default), sentence-transformers, or OpenAI |
| `MEMORA_EMBEDDING_MODEL` | Model for sentence-transformers |
| | API key for OpenAI embeddings (required when using the OpenAI backend) |
| | OpenAI embedding model |
| `AWS_PROFILE` | AWS credentials profile from `~/.aws/credentials` (useful for R2) |
| | S3-compatible endpoint for R2/MinIO |
| | Public domain for R2 image URLs |
Semantic Search & Embeddings
Memora supports three embedding backends for semantic search:
| Backend | Install | Quality | Speed |
|---------|---------|---------|-------|
| TF-IDF (default) | None | Basic keyword matching | Fast |
| sentence-transformers | `pip install sentence-transformers` | True semantic understanding | Medium |
| OpenAI | `pip install openai` | High quality | API latency |
Automatic: Embeddings and cross-references are computed automatically when you call `memory_create`, `memory_update`, or `memory_create_batch`.
Manual rebuild required when:
- Changing `MEMORA_EMBEDDING_MODEL` after memories exist
- Switching to a different sentence-transformers model
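To make the backend comparison concrete, here is a minimal sketch of what embedding-based search does, using sentence-transformers directly (this illustrates the technique, not Memora's internal API; the model name is only an example):

```python
from sentence_transformers import SentenceTransformer, util

# Example model; Memora's configured model may differ
model = SentenceTransformer("all-MiniLM-L6-v2")

memories = [
    "Deploy steps for the staging cluster",
    "Postgres connection pooling settings",
    "How to rotate the staging TLS certificates",
]

# Encode memories and a query into dense vectors
corpus_emb = model.encode(memories, convert_to_tensor=True)
query_emb = model.encode("renew certs on staging", convert_to_tensor=True)

# Cosine similarity finds related memories even with no keyword overlap
scores = util.cos_sim(query_emb, corpus_emb)[0]
best = int(scores.argmax())
print(memories[best], float(scores[best]))
```

Note that plain TF-IDF would miss this match, since "renew certs" shares no tokens with "rotate the TLS certificates".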
Neovim Integration
Browse memories directly in Neovim with Telescope. Copy the plugin to your config:
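A sketch of the copy step; the source path inside the repository is an assumption, so adjust it to your checkout:

```bash
# Source path assumed; destination is a standard Neovim Lua config location
cp memora/nvim/memora.lua ~/.config/nvim/lua/
```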
Usage: Press `<leader>sm` to open the memory browser with fuzzy search and preview.
Requires: `telescope.nvim`, `plenary.nvim`, and `memora` installed in your Python environment.
Knowledge Graph Export
Export memories as an interactive HTML knowledge graph visualization:
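A sketch of the export invocation; the subcommand and flag are assumptions, not Memora's documented CLI:

```bash
# Subcommand and flag assumed for illustration
memora export-graph --output graph.html
```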
Interactive vis.js graph with tag/section filtering, memory tooltips, Mermaid diagram rendering, and auto-resized image thumbnails. Click nodes to view content, drag to explore.
Live Graph Server
A built-in HTTP server starts automatically with the MCP server, serving the graph visualization on-demand.
Access locally:
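For example (substitute your configured graph port):

```
http://localhost:<port>/
```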
Remote access via SSH:
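Standard SSH local port forwarding works; substitute the configured port and host:

```bash
# Forward the remote graph port to your workstation,
# then open http://localhost:<port> locally
ssh -L <port>:localhost:<port> user@remote-host
```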
Configuration:
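For example, setting the port before starting the server; the variable name is assumed here (see the environment variable table above):

```bash
# Variable name assumed
export MEMORA_GRAPH_PORT=8787
```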
Use different ports on different machines to avoid conflicts when forwarding multiple servers.
To disable: add `--no-graph` to `args` in your MCP config.