
Remembra

by remembra-ai

🚀 What's New in v0.8.2

  • 🔐 AES-256-GCM Field Encryption — Encrypt memory content at rest with OWASP-compliant key derivation

  • 🛡️ Enterprise Security Suite — PII detection, anomaly monitoring, audit logging

  • 📦 MCP Registry Published — Discoverable as io.github.remembra-ai/remembra in Claude Desktop

  • ⚡ One-Command Quick Start — curl | bash zero-config setup with Ollama embeddings

  • 🔌 Multi-Provider Support — OpenAI, Anthropic Claude, Ollama for embeddings & entity extraction

  • 📊 Usage Warning Banners — API responses include usage thresholds at 60/80/95%


The Problem

Every AI app needs memory. Your chatbot forgets users between sessions. Your agent can't recall decisions from yesterday. Your assistant asks the same questions over and over.

The current solutions fall short:

  • Mem0: $249/mo for graph features; self-hosting docs are sparse

  • Zep: Academic, complex to deploy

  • Letta: Research-grade, not production-ready

  • LangChain Memory: Too basic, no persistence

The Solution

from remembra import Memory

memory = Memory(user_id="user_123")

# Store — entities and facts extracted automatically
memory.store("Had a meeting with Sarah from Acme Corp. She prefers email over Slack.")

# Recall — semantic search finds relevant memories
result = memory.recall("How should I contact Sarah?")
print(result.context)
# → "Sarah from Acme Corp prefers email over Slack."

# It knows "Sarah" and "Acme Corp" are entities. It builds relationships.
# It persists across sessions, reboots, context windows. Forever.

⚡ Quick Start (2 Minutes)

One Command Install

curl -sSL https://raw.githubusercontent.com/remembra-ai/remembra/main/quickstart.sh | bash

That's it. Remembra + Qdrant + Ollama start locally. No API keys needed.

Or with Docker Compose directly:

git clone https://github.com/remembra-ai/remembra && cd remembra
docker compose -f docker-compose.quickstart.yml up -d

Try it:

# Store a memory
curl -X POST http://localhost:8787/api/v1/memories/store \
  -H "Content-Type: application/json" \
  -d '{"content": "Alice is CEO of Acme Corp", "user_id": "demo"}'

# Recall it
curl -X POST http://localhost:8787/api/v1/memories/recall \
  -H "Content-Type: application/json" \
  -d '{"query": "Who runs Acme?", "user_id": "demo"}'

Connect to Claude (MCP)

Claude Desktop — add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS path shown):

{
  "mcpServers": {
    "remembra": {
      "command": "remembra-mcp",
      "env": {
        "REMEMBRA_URL": "http://localhost:8787",
        "REMEMBRA_USER_ID": "default"
      }
    }
  }
}

Claude Code:

claude mcp add remembra -e REMEMBRA_URL=http://localhost:8787 -- remembra-mcp

Cursor — add to .cursor/mcp.json:

{
  "mcpServers": {
    "remembra": {
      "command": "remembra-mcp",
      "env": {
        "REMEMBRA_URL": "http://localhost:8787"
      }
    }
  }
}

Now ask Claude: "Remember that Alice is CEO of Acme Corp" — then later: "Who runs Acme?"

Python SDK

pip install remembra
from remembra import Memory

memory = Memory(user_id="user_123")
memory.store("Had a meeting with Sarah from Acme Corp. She prefers email over Slack.")
result = memory.recall("How should I contact Sarah?")
print(result.context)  # "Sarah from Acme Corp prefers email over Slack."
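The SDK calls above map onto the REST endpoints from the Quick Start. A minimal sketch of the equivalent raw HTTP calls using only the standard library — the helper functions here are illustrative, not part of the SDK:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8787"  # default Quick Start address

def build_body(user_id: str, **fields) -> bytes:
    """Serialize a JSON request body for the memories endpoints."""
    return json.dumps({"user_id": user_id, **fields}).encode("utf-8")

def post(path: str, body: bytes) -> dict:
    """POST a JSON body and decode the JSON response."""
    req = urllib.request.Request(
        BASE_URL + path,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Endpoint paths are taken from the curl examples in the Quick Start.
store_body = build_body("demo", content="Alice is CEO of Acme Corp")
recall_body = build_body("demo", query="Who runs Acme?")

if __name__ == "__main__":  # requires a running Remembra server
    post("/api/v1/memories/store", store_body)
    print(post("/api/v1/memories/recall", recall_body))
```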

TypeScript SDK

npm install remembra
import { Remembra } from 'remembra';

const memory = new Remembra({ url: 'http://localhost:8787' });
await memory.store('User prefers dark mode');
const result = await memory.recall('preferences');

🔥 Why Remembra?

Feature Comparison

| Feature | Remembra | Mem0 | Zep/Graphiti | Letta | Engram |
|---|---|---|---|---|---|
| One-Command Install | ✅ curl \| bash | ✅ pip | ✅ pip | ⚠️ Complex | ✅ brew |
| Entity Resolution | ✅ Free | 💰 $249/mo | ✅ | ❌ | ❌ |
| Conflict Detection | ✅ Unique | ❌ | ❌ | ❌ | ❌ |
| PII Detection | ✅ Built-in | ❌ | ❌ | ❌ | ❌ |
| Hybrid Search | ✅ BM25+Vector | ❌ | ✅ | ❌ | ❌ |
| 6 Embedding Providers | ✅ Hot-swap | ❌ (1-2) | ❌ (1) | ❌ | ❌ |
| Plugin System | ✅ | ❌ | ❌ | ✅ | ❌ |
| Sleep-Time Compute | ✅ | ❌ | ❌ | ✅ | ❌ |
| Self-Host + Billing | ✅ Stripe | ❌ | ❌ | ❌ | ❌ |
| Memory Spaces | ✅ Multi-tenant | ❌ | ❌ | ❌ | ❌ |
| MCP Server | ✅ Native | ✅ | ❌ | ❌ | ✅ |
| Pricing | Free / $49 / $99 | $19 → $249 | $25+ | Free | Free |
| License | MIT | Apache 2.0 | Apache 2.0 | Apache 2.0 | MIT |

Core Features

🧠 Smart Extraction — LLM-powered fact extraction from raw text

👥 Entity Resolution — "Adam", "Mr. Smith", "my husband" → same person

⏱️ Temporal Memory — TTL, decay curves, historical queries

🔍 Hybrid Search — Semantic + keyword for accurate recall

🔒 Security — PII detection, anomaly monitoring, audit logs

📊 Dashboard — Visual memory browser, entity graphs, analytics
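Hybrid search merges a BM25 keyword ranking with a vector-similarity ranking. Whether Remembra fuses them with reciprocal rank fusion (RRF) or a weighted-score blend isn't documented here, but RRF is a common way to sketch the idea:

```python
from collections import defaultdict

def rrf_fuse(keyword_ranking: list[str],
             vector_ranking: list[str],
             k: int = 60) -> list[str]:
    """Merge two ranked lists of memory IDs with reciprocal rank fusion.

    Illustrative only: RRF is one standard fusion method, not
    necessarily the one Remembra ships. Each item scores 1/(k + rank)
    per list it appears in, so items ranked highly in both lists win.
    """
    scores: dict[str, float] = defaultdict(float)
    for ranking in (keyword_ranking, vector_ranking):
        for rank, mem_id in enumerate(ranking, start=1):
            scores[mem_id] += 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)
```

For example, a memory that is second in the keyword list but first in the vector list outranks one that appears in only a single list.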


📊 Benchmark Results

Tested on the LoCoMo benchmark (Snap Research, ACL 2024) — the standard academic benchmark for AI memory systems.

| Category | Accuracy | Questions |
|---|---|---|
| Single-hop (direct recall) | 100% | 37 |
| Multi-hop (cross-session reasoning) | 100% | 32 |
| Temporal (time-based queries) | 100% | 13 |
| Open-domain (world knowledge + memory) | 100% | 70 |
| Overall (memory categories) | 100% | 152 |

Scored with LLM judge (GPT-4o-mini). Adversarial detection not yet implemented. Run your own: python benchmarks/locomo_runner.py --data /tmp/locomo/data/locomo10.json


📖 Documentation

| Resource | Description |
|---|---|
| Quick Start | Get running in minutes |
| Python SDK | Full Python reference |
| TypeScript SDK | JavaScript/TypeScript guide |
| MCP Server | Tool reference + setup guides for 9 tools |
| REST API | API reference |
| Self-Hosting | Docker deployment guide |


🛠️ MCP Server

Give any AI coding tool persistent memory with one command. Works with Claude Code, Cursor, VS Code + Copilot, Windsurf, JetBrains, Zed, OpenAI Codex, and any MCP-compatible client.

pip install remembra[mcp]
claude mcp add remembra -e REMEMBRA_URL=http://localhost:8787 -- remembra-mcp

Available Tools:

| Tool | Description |
|---|---|
| store_memory | Save facts, decisions, context |
| recall_memories | Semantic search across memories |
| forget_memories | GDPR-compliant deletion |
| ingest_conversation | Auto-extract from chat history |
| health_check | Verify connection |
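Over the wire, an MCP client invokes these tools with standard JSON-RPC `tools/call` requests. A sketch for `store_memory` — the argument names mirror the REST body shown in the Quick Start and are an assumption, not the published tool schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "store_memory",
    "arguments": {
      "content": "Alice is CEO of Acme Corp",
      "user_id": "demo"
    }
  }
}
```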


πŸ—οΈ Architecture

┌──────────────────────────────────────────────────────────────┐
│                       Your Application                       │
├──────────┬──────────────┬────────────────────────────────────┤
│ Python   │ TypeScript   │ MCP Server (Claude/Cursor)         │
│ SDK      │ SDK          │ remembra-mcp                       │
├──────────┴──────────────┴────────────────────────────────────┤
│                      Remembra REST API                       │
├──────────────┬──────────────┬───────────────┬────────────────┤
│  Extraction  │   Entities   │   Retrieval   │   Security     │
│  (LLM)       │  (Graph)     │  (Hybrid)     │  (PII/Audit)   │
├──────────────┴──────────────┴───────────────┴────────────────┤
│                        Storage Layer                         │
│         Qdrant (vectors) + SQLite (metadata/graph)           │
└──────────────────────────────────────────────────────────────┘

🤝 Contributing

We welcome contributions! See CONTRIBUTING.md for guidelines.

# Clone
git clone https://github.com/remembra-ai/remembra
cd remembra

# Install dev dependencies
pip install -e ".[dev]"

# Run tests
pytest

# Start dev server
remembra-server --reload

📄 License

MIT License — Use it however you want.


⭐ Star History

If Remembra helps you, please star the repo! It helps others discover the project.


