Continuo Memory System

by GtOkAi

Persistent memory and hierarchical compression for development environments

Python 3.9+ License: AGPL v3 Commercial License MCP Protocol

Overview

Continuo is a persistent memory system that provides semantic search and storage capabilities for development workflows. By separating reasoning (LLM) from long-term memory (Vector DB + hierarchical compression), the system maintains knowledge indefinitely, circumventing context window limitations.

Key Features

  • Persistent Memory - Store and retrieve development knowledge across sessions

  • Semantic Search - Find relevant information using natural language queries

  • Hierarchical Compression - N0 (chunks) → N1 (summaries) → N2 (meta-summaries)

  • MCP Integration - Seamless integration with IDEs via Model Context Protocol

  • Cost Effective - 100% local (free) or hybrid (low-cost) deployment options

  • FastMCP - Built on the modern MCP server framework

Quick Start

Installation

```shell
git clone https://github.com/GtOkAi/continuo-memory-mcp.git
cd continuo-memory-mcp
./scripts/setup_memory.sh
```

Usage

  1. Start the memory server:

```shell
./scripts/run_memory_server.sh
```

  2. Configure your IDE (Qoder/Cursor) by creating `.qoder/mcp.json` (or `.cursor/mcp.json`):

```json
{
  "mcpServers": {
    "continuo-memory": {
      "command": "/absolute/path/to/continuo/venv_memory/bin/python",
      "args": [
        "/absolute/path/to/continuo/src/mcp/memory/mcp_memory_server.py",
        "--provider", "local",
        "--db-path", "/absolute/path/to/memory_db"
      ]
    }
  }
}
```

  3. Use the tools in your IDE chat:

```
@continuo-memory search_memory("authentication implementation")
@continuo-memory store_memory("Fixed JWT validation bug", {"file": "auth.py"})
@continuo-memory get_memory_stats()
```

Architecture

```
IDE Chat ──► MCP Adapter ──► Memory Server ──► ChromaDB
    ▲             ▲               │                │
    │             └──── tools ◄───┘                │
    └───── response ◄──── context ◄────────────────┘
```

Components

  • Memory Server - ChromaDB + sentence-transformers for embeddings

  • MCP Adapter - FastMCP server exposing search_memory and store_memory tools

  • Hierarchical Compression - Multi-level context optimization (N0/N1/N2)

  • Autonomous Mode - Optional automation with Observe → Plan → Act → Reflect cycle
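The autonomous mode's Observe → Plan → Act → Reflect cycle can be sketched as a simple control loop. All names below are illustrative, not the project's actual API:

```python
# Minimal sketch of an Observe -> Plan -> Act -> Reflect cycle.
# The function names and callback signatures are hypothetical.

def autonomous_cycle(observe, plan, act, reflect, iterations=3):
    """Run a fixed number of observe/plan/act/reflect iterations."""
    history = []
    for _ in range(iterations):
        observation = observe()           # gather current state
        action = plan(observation)        # decide what to do next
        result = act(action)              # execute the decision
        history.append(reflect(observation, action, result))  # record the lesson
    return history

# Toy usage with stub callbacks:
log = autonomous_cycle(
    observe=lambda: "state",
    plan=lambda obs: f"act-on-{obs}",
    act=lambda action: f"done-{action}",
    reflect=lambda obs, action, result: result,
)
print(log)  # ['done-act-on-state', 'done-act-on-state', 'done-act-on-state']
```

In the real system, `reflect` would presumably write its lesson back into persistent memory via `store_memory`, closing the loop between automation and the knowledge base.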

Configuration

Local Embeddings (Free)

```shell
python src/mcp/memory/mcp_memory_server.py \
  --provider local \
  --db-path ./memory_db
```

OpenAI Embeddings (Low-cost)

```shell
python src/mcp/memory/mcp_memory_server.py \
  --provider openai \
  --api-key sk-your-key \
  --db-path ./memory_db
```
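The flags above map naturally onto a small `argparse` setup. This is a hedged sketch of what the server's option handling might look like, not the project's actual code:

```python
import argparse

def build_parser():
    """Command-line options matching the flags shown above (illustrative)."""
    parser = argparse.ArgumentParser(description="Continuo memory server options")
    parser.add_argument("--provider", choices=["local", "openai"], default="local",
                        help="Embedding backend: free local model or OpenAI API")
    parser.add_argument("--api-key", default=None,
                        help="OpenAI API key (only needed with --provider openai)")
    parser.add_argument("--db-path", default="./memory_db",
                        help="Directory for the persistent vector store")
    return parser

args = build_parser().parse_args(["--provider", "local", "--db-path", "./memory_db"])
print(args.provider, args.db_path)  # local ./memory_db
```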

API

Tools

```
search_memory(query: str, top_k: int = 5, level: str | None = None) -> str
```

  • Semantic search in persistent memory

  • Returns relevant documents with similarity scores

```
store_memory(text: str, metadata: dict | None = None, level: str = "N0") -> str
```

  • Store content in persistent memory

  • Supports metadata tagging and hierarchical levels

```
get_memory_stats() -> str
```

  • Get memory statistics (total documents, levels, etc.)

Hierarchical Levels

  • N0 - Raw chunks (code snippets, conversations)

  • N1 - Micro-summaries (5-10 chunks compressed)

  • N2 - Meta-summaries (5-10 summaries compressed)
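The N0 → N1 → N2 scheme can be sketched as repeated group-and-summarize passes. The summarizer below is a trivial placeholder; the real system would use an LLM to produce the summaries:

```python
# Sketch of N0 -> N1 -> N2 hierarchical compression.
# summarize() is a placeholder for an LLM-backed summarizer.

def summarize(texts: list[str]) -> str:
    """Placeholder: keep the first few words of each input."""
    return " | ".join(" ".join(t.split()[:3]) for t in texts)

def compress(items: list[str], group_size: int = 5) -> list[str]:
    """Compress each group of items into one summary (one level up)."""
    return [
        summarize(items[i:i + group_size])
        for i in range(0, len(items), group_size)
    ]

n0 = [f"chunk {i}: details about topic {i}" for i in range(25)]
n1 = compress(n0)  # 25 raw chunks -> 5 micro-summaries
n2 = compress(n1)  # 5 micro-summaries -> 1 meta-summary
print(len(n0), len(n1), len(n2))  # 25 5 1
```

Each level trades detail for breadth: a query can be answered from N2 cheaply, then drill down to N1 or N0 when more detail is needed.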

Examples

See the examples/memory/ directory:

  • basic_usage.py - Simple store/retrieve operations

  • hierarchical_demo.py - Multi-level compression examples

  • auto_mode_demo.py - Autonomous mode demonstration

Documentation

Technology Stack

  • Python 3.9+ - Core implementation

  • ChromaDB - Vector database for embeddings

  • Sentence Transformers - Local embedding generation (all-MiniLM-L6-v2)

  • FastMCP - MCP server framework

  • Model Context Protocol - IDE integration standard

Cost & Licensing

Embedding Providers

| Provider | Storage | Search | Monthly (1000 queries) |
|---|---|---|---|
| Local (sentence-transformers) | Free | Free | $0 |
| OpenAI embeddings | Free | ~$0.0001/query | ~$0.10 |
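The monthly figure follows directly from the per-query estimate:

```python
# Rough monthly cost for the OpenAI embedding provider.
queries_per_month = 1000
cost_per_query = 0.0001  # ~USD per query (approximate)
monthly = queries_per_month * cost_per_query
print(f"${monthly:.2f}")  # $0.10
```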

Software License

| Use Case | License | Cost |
|---|---|---|
| Individual/Research | AGPL v3 | Free |
| Startup (<$1M revenue, <10 employees) | AGPL v3 | Free |
| Non-profit/Education | AGPL v3 | Free |
| Commercial (≥$1M revenue OR ≥10 employees) | Commercial | From $2,500/year |

See COMMERCIAL_LICENSE.md for details.

Contributing

Contributions are welcome! Please read CONTRIBUTING.md for guidelines.

License

Continuo Memory System is dual-licensed:

📖 Open Source (AGPL v3)

FREE for:

  • ✅ Individual developers and researchers

  • ✅ Non-profit organizations and educational institutions

  • ✅ Companies with <$1M revenue AND <10 employees

  • ✅ Development, testing, and evaluation

  • ✅ Open source projects (AGPL-compatible)

Requirements: Share source code of modifications under AGPL v3

See LICENSE for full AGPL v3 terms.

💼 Commercial License

REQUIRED for:

  • ❌ Companies with ≥$1M revenue OR ≥10 employees

  • ❌ Proprietary/closed-source products

  • ❌ SaaS offerings without source disclosure

Benefits:

  • ✅ No AGPL copyleft obligations

  • ✅ Proprietary use rights

  • ✅ Priority support (optional)

  • ✅ Custom deployment assistance (optional)

Pricing: From $2,500/year (Bronze) to custom Enterprise

See COMMERCIAL_LICENSE.md for pricing and details.

💡 Why AGPL + Commercial?

  • Sustainable Development: Commercial users fund ongoing maintenance

  • Open Source Protection: AGPL prevents proprietary forks

  • Fair Use: Small teams and non-profits can use it free of charge indefinitely

  • Community First: Core features always open source

Contact: licensing@continuo.dev (UPDATE) for commercial inquiries

Acknowledgments

Built using the open source projects listed in the Technology Stack above.

Authors

  • D.D. & Gustavo Porto


Note: This project implements the architecture described in continuo.markdown. For academic context and detailed specifications, refer to that document.
