
🚀 What's New in v0.13.0

Dashboard v2.0

  • 🔐 Two-Factor Authentication — TOTP-based 2FA with authenticator apps
  • 👥 Team Collaboration — Shared memory spaces with role-based access
  • 🛠️ Admin Dashboard — Full user management (delete/deactivate/reset)
  • 📊 Activity Log — Security audit trail with JSON export
  • 🕵️ Entity Browser — Visual exploration of people, places, concepts
  • ⏰ Timeline Fix — Proper timezone handling with local time display

Core API

  • 📦 npm Package — npm install remembra with full TypeScript support
  • 🔒 Security Fixes — RBAC enforcement, SSRF protection, error sanitization

Supported Agents (6+)

Claude Desktop • Claude Code • Codex CLI • Cursor • Windsurf • Gemini

Previous (v0.12.x)

  • 👤 User Profiles API with activity metrics
  • 🧠 Smart Auto-Forgetting (35+ temporal patterns)
  • ⏰ Event-driven expiry with expires_at
  • 🌐 Browser Extension for AI chat interfaces


The Problem

Every AI app needs memory. Your chatbot forgets users between sessions. Your agent can't recall decisions from yesterday. Your assistant asks the same questions over and over.

Existing solutions have tradeoffs:

  • Mem0: Graph features require $249/mo plan; limited self-hosting documentation
  • Zep: Academic approach, complex deployment
  • Letta: Research-grade, not production-ready
  • LangChain Memory: Too basic, no persistence

The Solution

from remembra import Memory

memory = Memory(user_id="user_123")

# Store — entities and facts extracted automatically
memory.store("Had a meeting with Sarah from Acme Corp. She prefers email over Slack.")

# Recall — semantic search finds relevant memories
result = memory.recall("How should I contact Sarah?")
print(result.context)
# → "Sarah from Acme Corp prefers email over Slack."

# It knows "Sarah" and "Acme Corp" are entities. It builds relationships.
# It persists across sessions, reboots, context windows. Forever.
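Entity linking like this happens server-side inside Remembra; as a rough illustration of the general idea only (not Remembra's actual algorithm), alias resolution can be sketched as a mapping from surface mentions to canonical entities:

```python
# Illustrative sketch of entity resolution: different surface mentions
# ("Sarah", "Ms. Sarah", "sarah from acme") map onto one canonical entity.
# This is NOT Remembra's implementation, just the underlying concept.

class EntityResolver:
    def __init__(self):
        self.aliases = {}  # normalized mention -> canonical entity name

    def register(self, canonical, mentions):
        for mention in mentions:
            self.aliases[mention.lower().strip()] = canonical

    def resolve(self, mention):
        # Unknown mentions pass through unchanged.
        return self.aliases.get(mention.lower().strip(), mention)

resolver = EntityResolver()
resolver.register("Sarah (Acme Corp)", ["Sarah", "Ms. Sarah", "sarah from acme"])
print(resolver.resolve("Ms. Sarah"))  # → Sarah (Acme Corp)
print(resolver.resolve("Bob"))        # → Bob
```

A production resolver would add fuzzy matching and context ("my husband" resolves differently per user), but the core data structure is the same alias map.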

⚡ Quick Start (2 Minutes)

One Command Install

curl -sSL https://raw.githubusercontent.com/remembra-ai/remembra/main/quickstart.sh | bash

That's it. Remembra + Qdrant + Ollama start locally. No API keys needed.

Or with Docker Compose directly:

git clone https://github.com/remembra-ai/remembra && cd remembra
docker compose -f docker-compose.quickstart.yml up -d

Try it:

# Store a memory
curl -X POST http://localhost:8787/api/v1/memories \
  -H "Content-Type: application/json" \
  -d '{"content": "Alice is CEO of Acme Corp", "user_id": "demo"}'

# Recall it
curl -X POST http://localhost:8787/api/v1/memories/recall \
  -H "Content-Type: application/json" \
  -d '{"query": "Who runs Acme?", "user_id": "demo"}'
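The same two endpoints can be called from Python without the SDK. A minimal standard-library sketch (endpoint paths and JSON bodies as shown above; the response schema is an assumption — check the REST API reference):

```python
# Minimal sketch of calling the REST endpoints above from Python using
# only the standard library. Response fields are not guaranteed here;
# consult the REST API docs for the actual schema.
import json
import urllib.request

BASE = "http://localhost:8787/api/v1"

def _post(path, payload):
    req = urllib.request.Request(
        BASE + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def store(content, user_id="demo"):
    return _post("/memories", {"content": content, "user_id": user_id})

def recall(query, user_id="demo"):
    return _post("/memories/recall", {"query": query, "user_id": user_id})

# Usage (requires the quickstart stack to be running):
#   store("Alice is CEO of Acme Corp")
#   recall("Who runs Acme?")
```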

Connect ALL Your AI Agents (NEW in v0.10.0)

One command configures everything:

pip install remembra
remembra-install --all --url http://localhost:8787

This auto-detects and configures: Claude Desktop, Claude Code, Codex CLI, Cursor, Windsurf, Gemini.

Verify setup:

remembra-doctor all

Claude Desktop — add to ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "remembra": {
      "command": "remembra-mcp",
      "env": {
        "REMEMBRA_URL": "http://localhost:8787",
        "REMEMBRA_USER_ID": "default"
      }
    }
  }
}

Claude Code:

claude mcp add remembra -e REMEMBRA_URL=http://localhost:8787 -- remembra-mcp

Cursor — add to .cursor/mcp.json:

{
  "mcpServers": {
    "remembra": {
      "command": "remembra-mcp",
      "env": {
        "REMEMBRA_URL": "http://localhost:8787"
      }
    }
  }
}

Now ask Claude: "Remember that Alice is CEO of Acme Corp" — then later: "Who runs Acme?"

Python SDK

pip install remembra
from remembra import Memory

memory = Memory(user_id="user_123")
memory.store("Had a meeting with Sarah from Acme Corp. She prefers email over Slack.")
result = memory.recall("How should I contact Sarah?")
print(result.context)  # "Sarah from Acme Corp prefers email over Slack."

TypeScript SDK

npm install remembra
import { Remembra } from 'remembra';

const memory = new Remembra({ url: 'http://localhost:8787' });
await memory.store('User prefers dark mode');
const result = await memory.recall('preferences');

🔥 Why Remembra?

Feature Comparison

| Feature | Remembra | Mem0 | Zep/Graphiti | Letta | Engram |
|---|---|---|---|---|---|
| One-Command Install | ✅ curl \| bash | ✅ pip | ✅ pip | ⚠️ Complex | ✅ brew |
| Bi-Temporal Relationships | ✅ Point-in-time | ❌ | ⚠️ Basic | ❌ | ❌ |
| Entity Resolution | ✅ Free | 💰 $249/mo | ✅ | ❌ | ❌ |
| Conflict Detection | ✅ Auto-supersede | ❌ | ❌ | ❌ | ❌ |
| PII Detection | ✅ Built-in | ❌ | ❌ | ❌ | ❌ |
| Hybrid Search | ✅ BM25+Vector | ❌ | ✅ | ❌ | ❌ |
| 6 Embedding Providers | ✅ Hot-swap | ❌ (1-2) | ❌ (1) | ❌ | ❌ |
| Plugin System | ✅ | ❌ | ❌ | ✅ | ❌ |
| Sleep-Time Compute | ✅ | ❌ | ❌ | ✅ | ❌ |
| Self-Host + Billing | ✅ Stripe | ❌ | ❌ | ❌ | ❌ |
| Memory Spaces | ✅ Multi-tenant | ❌ | ❌ | ❌ | ❌ |
| MCP Server | ✅ 11 Tools | ✅ | ❌ | ❌ | ✅ |
| Pricing | Free / $49 / $199 | $19 → $249 | $25+ | Free | Free |
| License | MIT | Apache 2.0 | Apache 2.0 | Apache 2.0 | MIT |

Core Features

🧠 Smart Extraction — LLM-powered fact extraction from raw text
👥 Entity Resolution — "Adam", "Mr. Smith", "my husband" → same person
⏱️ Temporal Memory — TTL, decay curves, historical queries
🔍 Hybrid Search — Semantic + keyword for accurate recall
🔒 Security — PII detection, anomaly monitoring, audit logs
📊 Dashboard — Visual memory browser, entity graphs, analytics
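Hybrid search blends a lexical score with a semantic one. A toy sketch of the idea (not Remembra's ranking code: simple keyword overlap stands in for BM25, and cosine similarity runs over made-up embeddings):

```python
# Toy illustration of hybrid ranking: combine a lexical score (keyword
# overlap, a stand-in for BM25) with a semantic score (cosine similarity
# over embeddings). Weights and vectors here are illustrative only.
import math

def lexical_score(query, doc):
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_score(query, doc, q_vec, d_vec, alpha=0.5):
    # alpha balances lexical vs. semantic contribution.
    return alpha * lexical_score(query, doc) + (1 - alpha) * cosine(q_vec, d_vec)

docs = {
    "m1": ("Sarah prefers email over Slack", [0.9, 0.1]),
    "m2": ("Weekly standup moved to 10am", [0.1, 0.9]),
}
q_vec = [0.8, 0.2]  # pretend embedding of the query
ranked = sorted(
    docs,
    key=lambda k: hybrid_score("contact Sarah email", docs[k][0], q_vec, docs[k][1]),
    reverse=True,
)
print(ranked)  # → ['m1', 'm2']
```

The lexical term catches exact names and jargon the embedding may blur; the semantic term catches paraphrases the keywords miss, which is why the combination tends to recall more accurately than either alone.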


📊 Benchmark Results

Tested on the LoCoMo benchmark (Snap Research, ACL 2024) — the standard academic benchmark for AI memory systems.

| Category | Accuracy | Questions |
|---|---|---|
| Single-hop (direct recall) | 100% | 37 |
| Multi-hop (cross-session reasoning) | 100% | 32 |
| Temporal (time-based queries) | 100% | 13 |
| Open-domain (world knowledge + memory) | 100% | 70 |
| Overall (memory categories) | 100% | 152 |

Scored with LLM judge (GPT-4o-mini). Adversarial detection not yet implemented. Run your own: `python benchmarks/locomo_runner.py --data /tmp/locomo/data/locomo10.json`
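The judge-based scoring loop has a simple shape; a self-contained sketch with a naive normalized match standing in for the GPT-4o-mini judge (the real protocol lives in `benchmarks/locomo_runner.py`):

```python
# Sketch of a judge-based accuracy loop. The actual runner uses an LLM
# judge (GPT-4o-mini per the text above); a naive substring match stands
# in here so the example runs without API access.
def naive_judge(answer, gold):
    return gold.lower().strip() in answer.lower()

def score(predictions, judge=naive_judge):
    correct = sum(judge(answer, gold) for answer, gold in predictions)
    return correct / len(predictions)

preds = [
    ("Alice is the CEO of Acme Corp.", "Alice"),
    ("It moved to 10am.", "10am"),
]
print(f"accuracy: {score(preds):.0%}")  # → accuracy: 100%
```

Swapping `judge` for an LLM call is what makes the metric robust to paraphrase, which substring matching is not.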


📖 Documentation

| Resource | Description |
|---|---|
| Quick Start | Get running in minutes |
| Python SDK | Full Python reference |
| TypeScript SDK | JavaScript/TypeScript guide |
| MCP Server | Tool reference + setup guides for 11 tools |
| REST API | API reference |
| Self-Hosting | Docker deployment guide |


šŸ› ļø MCP Server

Give any AI coding tool persistent memory with one command. Works with Claude Code, Cursor, VS Code + Copilot, Windsurf, JetBrains, Zed, OpenAI Codex, and any MCP-compatible client.

pip install remembra[mcp]
claude mcp add remembra -e REMEMBRA_URL=http://localhost:8787 -- remembra-mcp

Available Tools (11 total):

| Tool | Description |
|---|---|
| `store_memory` | Save facts, decisions, context |
| `recall_memories` | Semantic search across memories |
| `update_memory` | Update content without delete+recreate |
| `forget_memories` | GDPR-compliant deletion |
| `list_memories` | Browse stored memories |
| `search_entities` | Search the entity graph |
| `share_memory` | Cross-agent memory sharing via Spaces |
| `timeline` | Temporal browsing by entity and date |
| `relationships_at` | Point-in-time relationship queries |
| `ingest_conversation` | Auto-extract from chat history |
| `health_check` | Verify connection |


šŸ—ļø Architecture

ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
│                    Your Application                          │
ā”œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¤
│ Python   │ TypeScript   │ MCP Server (Claude/Cursor)        │
│ SDK      │ SDK          │ remembra-mcp                      │
ā”œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”“ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”“ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¤
│                   Remembra REST API                          │
ā”œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¤
│  Extraction  │   Entities   │   Retrieval   │   Security    │
│  (LLM)       │  (Graph)     │ (Hybrid)      │  (PII/Audit)  │
ā”œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”“ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”“ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”“ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¤
│                    Storage Layer                             │
│         Qdrant (vectors) + SQLite (metadata/graph)          │
ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜
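The split storage layer keeps embeddings in Qdrant while SQLite holds metadata and the entity graph. A hedged sketch of what the metadata side could look like — the schema below is hypothetical, not Remembra's actual tables:

```python
# Illustrative sketch of the metadata half of the storage layer: SQLite
# stores memory rows and entity links, while each row's vector_id points
# at an embedding in Qdrant. Schema is hypothetical, not Remembra's.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE memories (
    id INTEGER PRIMARY KEY,
    user_id TEXT NOT NULL,
    content TEXT NOT NULL,
    vector_id TEXT        -- id of the embedding stored in Qdrant
);
CREATE TABLE entities (
    id INTEGER PRIMARY KEY,
    canonical TEXT UNIQUE
);
CREATE TABLE memory_entities (
    memory_id INTEGER REFERENCES memories(id),
    entity_id INTEGER REFERENCES entities(id)
);
""")

conn.execute(
    "INSERT INTO memories (user_id, content, vector_id) VALUES (?, ?, ?)",
    ("demo", "Alice is CEO of Acme Corp", "qdrant-point-1"),
)
conn.execute("INSERT INTO entities (canonical) VALUES (?)", ("Alice",))
conn.execute("INSERT INTO memory_entities VALUES (1, 1)")

# Graph-style lookup: all memories mentioning the entity "Alice".
row = conn.execute(
    "SELECT m.content FROM memories m "
    "JOIN memory_entities me ON me.memory_id = m.id "
    "JOIN entities e ON e.id = me.entity_id "
    "WHERE e.canonical = ?",
    ("Alice",),
).fetchone()
print(row[0])  # → Alice is CEO of Acme Corp
```

Splitting the stores this way lets vector search and relational graph queries each use the engine suited to them, joined by the shared vector_id.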

šŸ¤ Contributing

We welcome contributions! See CONTRIBUTING.md for guidelines.

# Clone
git clone https://github.com/remembra-ai/remembra
cd remembra

# Install dev dependencies
pip install -e ".[dev]"

# Run tests
pytest

# Start dev server
remembra-server --reload

📄 License

MIT License — Use it however you want.


⭐ Star History

If Remembra helps you, please star the repo! It helps others discover the project.


