<div align="center"> <img src="https://shigoto.me/continuo.jpeg" alt="Continuo"> </div>

## Continuo Memory System

**Persistent memory and hierarchical compression for development environments**

[![Python 3.9+](https://img.shields.io/badge/python-3.9+-blue.svg)](https://www.python.org/downloads/) [![License: AGPL v3](https://img.shields.io/badge/License-AGPL%20v3-blue.svg)](https://www.gnu.org/licenses/agpl-3.0.en.html) [![Commercial License](https://img.shields.io/badge/Commercial-Available-green.svg)](COMMERCIAL_LICENSE.md) [![MCP Protocol](https://img.shields.io/badge/MCP-2025--03--26-green.svg)](https://modelcontextprotocol.io)

## Overview

Continuo is a **persistent memory system** that provides semantic search and storage for development workflows. By separating reasoning (the LLM) from long-term memory (a vector database with hierarchical compression), the system retains knowledge indefinitely, working around context-window limitations.

### Key Features

- **Persistent Memory** - Store and retrieve development knowledge across sessions
- **Semantic Search** - Find relevant information using natural-language queries
- **Hierarchical Compression** - N0 (chunks) → N1 (summaries) → N2 (meta-summaries)
- **MCP Integration** - Seamless IDE integration via the Model Context Protocol
- **Cost Effective** - 100% local (free) or hybrid (low-cost) deployment options
- **FastMCP** - Built on the modern MCP server framework

## Quick Start

### Installation

```bash
git clone https://github.com/GtOkAi/continuo-memory-mcp.git
cd continuo-memory-mcp
./scripts/setup_memory.sh
```

### Usage

1. **Start the memory server:**

   ```bash
   ./scripts/run_memory_server.sh
   ```

2. **Configure your IDE (Qoder/Cursor):**

   Create `.qoder/mcp.json` (or `.cursor/mcp.json`):

   ```json
   {
     "mcpServers": {
       "continuo-memory": {
         "command": "/absolute/path/to/continuo/venv_memory/bin/python",
         "args": [
           "/absolute/path/to/continuo/src/mcp/memory/mcp_memory_server.py",
           "--provider", "local",
           "--db-path", "/absolute/path/to/memory_db"
         ]
       }
     }
   }
   ```

3. **Use in your IDE:**

   ```
   @continuo-memory search_memory("authentication implementation")
   @continuo-memory store_memory("Fixed JWT validation bug", {"file": "auth.py"})
   @continuo-memory get_memory_stats()
   ```

## Architecture

```
IDE Chat ──► MCP Adapter ──► Memory Server ──► ChromaDB
   ▲              ▲                │
   │              └──── tools ◄────┘
   │
   └───── response ◄──── context ◄───────────────┘
```

### Components

- **Memory Server** - ChromaDB + sentence-transformers for embeddings
- **MCP Adapter** - FastMCP server exposing `search_memory` and `store_memory` tools
- **Hierarchical Compression** - Multi-level context optimization (N0/N1/N2)
- **Autonomous Mode** - Optional automation with an Observe → Plan → Act → Reflect cycle

## Configuration

### Local Embeddings (Free)

```bash
python src/mcp/memory/mcp_memory_server.py \
  --provider local \
  --db-path ./memory_db
```

### OpenAI Embeddings (Low-cost)

```bash
python src/mcp/memory/mcp_memory_server.py \
  --provider openai \
  --api-key sk-your-key \
  --db-path ./memory_db
```

## API

### Tools

**`search_memory(query: str, top_k: int = 5, level: str | None = None) -> str`**

- Semantic search in persistent memory
- Returns relevant documents with similarity scores

**`store_memory(text: str, metadata: dict | None = None, level: str = "N0") -> str`**

- Store content in persistent memory
- Supports metadata tagging and hierarchical levels

**`get_memory_stats() -> str`**

- Returns memory statistics (total documents, levels, etc.)
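The semantics of these three tools can be sketched with a minimal in-memory analogue. This is an illustration only: the real server is backed by ChromaDB and sentence-transformers, while the `MemoryStore` class, `_embed` helper, and bag-of-words "embedding" below are stand-ins invented for the sketch.

```python
import math
from collections import Counter

def _embed(text: str) -> Counter:
    # Stand-in for a sentence-transformers embedding: bag-of-words counts.
    return Counter(text.lower().split())

def _cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class MemoryStore:
    """Toy analogue of the memory server's tool surface (hypothetical)."""

    def __init__(self):
        self.docs = []  # (text, metadata, level, embedding)

    def store_memory(self, text, metadata=None, level="N0"):
        self.docs.append((text, metadata or {}, level, _embed(text)))
        return f"stored at level {level}"

    def search_memory(self, query, top_k=5, level=None):
        q = _embed(query)
        hits = [(t, m, l, _cosine(q, e)) for t, m, l, e in self.docs
                if level is None or l == level]
        hits.sort(key=lambda h: h[3], reverse=True)
        return hits[:top_k]

    def get_memory_stats(self):
        levels = Counter(l for _, _, l, _ in self.docs)
        return {"total": len(self.docs), "levels": dict(levels)}

store = MemoryStore()
store.store_memory("Fixed JWT validation bug", {"file": "auth.py"})
store.store_memory("Added login rate limiting", {"file": "auth.py"})
print(store.search_memory("JWT bug", top_k=1)[0][0])  # → Fixed JWT validation bug
print(store.get_memory_stats())                        # → {'total': 2, 'levels': {'N0': 2}}
```

The `level` filter mirrors how the real tools scope a query to one tier of the N0/N1/N2 hierarchy.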
### Hierarchical Levels

- **N0** - Raw chunks (code snippets, conversations)
- **N1** - Micro-summaries (5-10 chunks compressed)
- **N2** - Meta-summaries (5-10 summaries compressed)

## Examples

See the [`examples/memory/`](examples/memory/) directory:

- `basic_usage.py` - Simple store/retrieve operations
- `hierarchical_demo.py` - Multi-level compression examples
- `auto_mode_demo.py` - Autonomous mode demonstration

## Documentation

- [Setup Guide](SETUP.md) - Detailed installation instructions
- [Architecture Specification](continuo.markdown) - Complete technical documentation
- [Code of Conduct](CODE_OF_CONDUCT.md) - Community guidelines

## Technology Stack

- **Python 3.9+** - Core implementation
- **ChromaDB** - Vector database for embeddings
- **Sentence Transformers** - Local embedding generation (all-MiniLM-L6-v2)
- **FastMCP** - MCP server framework
- **Model Context Protocol** - IDE integration standard

## Cost & Licensing

### Embedding Providers

| Provider | Storage | Search | Monthly (1,000 queries) |
|----------|---------|--------|-------------------------|
| Local (sentence-transformers) | Free | Free | $0 |
| OpenAI embeddings | Free | ~$0.0001/query | ~$0.10 |

### Software License

| Use Case | License | Cost |
|----------|---------|------|
| Individual/Research | AGPL v3 | **Free** |
| Startup (<$1M revenue, <10 employees) | AGPL v3 | **Free** |
| Non-profit/Education | AGPL v3 | **Free** |
| Commercial (≥$1M revenue OR ≥10 employees) | Commercial | **From $2,500/year** |

See [COMMERCIAL_LICENSE.md](COMMERCIAL_LICENSE.md) for details.

## Contributing

Contributions are welcome! Please read [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.
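The N0 → N1 → N2 compression described under Hierarchical Levels can be sketched in a few lines. This is a hypothetical illustration: `summarize` is a placeholder for whatever summarizer the real system applies, and the batch size of 5 is one point in the 5-10 range the README gives.

```python
def summarize(texts):
    # Placeholder: the real system would produce an actual summary here.
    return "SUMMARY[" + " | ".join(t[:20] for t in texts) + "]"

def compress(items, batch_size=5):
    # Collapse each batch of lower-level items into one higher-level summary.
    return [summarize(items[i:i + batch_size])
            for i in range(0, len(items), batch_size)]

n0 = [f"chunk {i}" for i in range(25)]  # raw chunks (N0)
n1 = compress(n0)                       # micro-summaries (N1)
n2 = compress(n1)                       # meta-summaries (N2)
print(len(n0), len(n1), len(n2))        # → 25 5 1
```

Each level shrinks the context roughly by the batch factor, which is what lets old sessions stay searchable without blowing up the context window.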
## License

Continuo Memory System is **dual-licensed**:

### 📖 Open Source (AGPL v3)

**FREE for:**

- ✅ Individual developers and researchers
- ✅ Non-profit organizations and educational institutions
- ✅ Companies with **<$1M revenue** AND **<10 employees**
- ✅ Development, testing, and evaluation
- ✅ Open source projects (AGPL-compatible)

**Requirements:** Share the source code of modifications under AGPL v3.

See [LICENSE](LICENSE) for the full AGPL v3 terms.

### 💼 Commercial License

**REQUIRED for:**

- ❌ Companies with **≥$1M revenue** OR **≥10 employees**
- ❌ Proprietary/closed-source products
- ❌ SaaS offerings without source disclosure

**Benefits:**

- ✅ No AGPL copyleft obligations
- ✅ Proprietary use rights
- ✅ Priority support (optional)
- ✅ Custom deployment assistance (optional)

**Pricing:** From **$2,500/year** (Bronze) to custom Enterprise tiers.

See [COMMERCIAL_LICENSE.md](COMMERCIAL_LICENSE.md) for pricing and details.

### 💡 Why AGPL + Commercial?

- **Sustainable Development** - Commercial users fund ongoing maintenance
- **Open Source Protection** - AGPL prevents proprietary forks
- **Fair Use** - Small teams and non-profits can use it for free indefinitely
- **Community First** - Core features are always open source

**Contact:** gustavo@shigoto.me for commercial inquiries.

## Acknowledgments

Built using:

- [Model Context Protocol](https://modelcontextprotocol.io) - Protocol specification
- [MCP Python SDK](https://github.com/modelcontextprotocol/python-sdk) - MCP implementation
- [ChromaDB](https://www.trychroma.com/) - Vector database
- [Sentence Transformers](https://www.sbert.net/) - Embedding models

## Authors

- **D.D. & Gustavo Porto**

---

**Note:** This project implements the architecture described in `continuo.markdown`. For academic context and detailed specifications, refer to that document.
