MCP Memory Service
Universal MCP memory service providing semantic memory search and persistent storage for AI assistants. Works with Claude Desktop, VS Code, Cursor, Continue, and 13+ AI applications, using SQLite-vec for fast local search and Cloudflare for global distribution.
🚀 Quick Start (2 minutes)
Universal Installer (Recommended)
Docker (Fastest)
Smithery (Claude Desktop)
⚠️ First-Time Setup Expectations
On your first run, you'll see some warnings that are completely normal:
- "WARNING: Failed to load from cache: No snapshots directory" - The service is checking for cached models (first-time setup)
- "WARNING: Using TRANSFORMERS_CACHE is deprecated" - Informational warning, doesn't affect functionality
- Model download in progress - The service automatically downloads a ~25MB embedding model (takes 1-2 minutes)
These warnings disappear after the first successful run. The service is working correctly! For details, see our First-Time Setup Guide.
🐍 Python 3.13 Compatibility Note
sqlite-vec may not have pre-built wheels for Python 3.13 yet. If installation fails:
- The installer will automatically try multiple installation methods
- Consider using Python 3.12 for the smoothest experience: `brew install python@3.12`
- Alternative: use the ChromaDB backend with `--storage-backend chromadb`
- See the Troubleshooting Guide for details
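Before installing, you can check whether your interpreter will hit this issue. A minimal sketch, assuming pre-built sqlite-vec wheels are available through Python 3.12:

```shell
# Print the interpreter version, then flag Python 3.13+, where sqlite-vec
# may need to build from source instead of installing from a wheel.
python3 --version
python3 -c 'import sys; sys.exit(0 if sys.version_info < (3, 13) else 1)' \
  && echo "Python < 3.13: pre-built sqlite-vec wheels should be available" \
  || echo "Python 3.13+: sqlite-vec may build from source or fail to install"
```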
🍎 macOS SQLite Extension Support
macOS users may encounter `enable_load_extension` errors with sqlite-vec:
- System Python on macOS lacks SQLite extension support by default
- Solution: use Homebrew Python: `brew install python && rehash`
- Alternative: use pyenv: `PYTHON_CONFIGURE_OPTS='--enable-loadable-sqlite-extensions' pyenv install 3.12.0`
- Fallback: use the ChromaDB backend: `export MCP_MEMORY_STORAGE_BACKEND=chromadb`
- See the Troubleshooting Guide for details
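You can confirm whether your current interpreter supports loadable extensions before switching Pythons. A small diagnostic: `enable_load_extension` only exists on `sqlite3` connections when Python was compiled with `--enable-loadable-sqlite-extensions`:

```shell
# Report whether this Python's sqlite3 module can load extensions like sqlite-vec.
python3 - <<'EOF'
import sqlite3

conn = sqlite3.connect(":memory:")
if hasattr(conn, "enable_load_extension"):
    print("extension loading: supported")
else:
    print("extension loading: NOT supported -- use a Homebrew or pyenv Python")
conn.close()
EOF
```

If this reports "NOT supported", any of the Homebrew, pyenv, or ChromaDB options above will work around it.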
📚 Complete Documentation
👉 Visit our comprehensive Wiki for detailed guides:
🚀 Setup & Installation
- 📋 Installation Guide - Complete installation for all platforms and use cases
- 🖥️ Platform Setup Guide - Windows, macOS, and Linux optimizations
- 🔗 Integration Guide - Claude Desktop, Claude Code, VS Code, and more
🧠 Advanced Topics
- 🧠 Advanced Configuration - Integration patterns, best practices, workflows
- ⚡ Performance Optimization - Speed up queries, optimize resources, scaling
- 👨‍💻 Development Reference - Claude Code hooks, API reference, debugging
🔧 Help & Reference
- 🔧 Troubleshooting Guide - Solutions for common issues
- ❓ FAQ - Frequently asked questions
- 📝 Examples - Practical code examples and workflows
✨ Key Features
🧠 Intelligent Memory Management
- Semantic search with vector embeddings
- Natural language time queries ("yesterday", "last week")
- Tag-based organization with smart categorization
- Memory consolidation with dream-inspired algorithms
🔗 Universal Compatibility
- Claude Desktop - Native MCP integration
- Claude Code - Memory-aware development with hooks
- VS Code, Cursor, Continue - IDE extensions
- 13+ AI applications - REST API compatibility
💾 Flexible Storage
- SQLite-vec - Fast local storage (recommended)
- ChromaDB - Multi-client collaboration
- Cloudflare - Global edge distribution
- Automatic backups and synchronization
🚀 Production Ready
- Cross-platform - Windows, macOS, Linux
- Service installation - Auto-start background operation
- HTTPS/SSL - Secure connections
- Docker support - Easy deployment
💡 Basic Usage
🔧 Configuration
Claude Desktop Integration
Add to your Claude Desktop config (`~/.claude/config.json`):
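An entry typically looks like the following. This is a sketch only: the server name, command, and path are assumptions to adapt to your installation, and the `sqlite_vec` backend value is assumed for the recommended local backend (the `MCP_MEMORY_STORAGE_BACKEND` variable itself is the one documented above):

```json
{
  "mcpServers": {
    "memory": {
      "command": "python",
      "args": ["/path/to/mcp-memory-service/server.py"],
      "env": {
        "MCP_MEMORY_STORAGE_BACKEND": "sqlite_vec"
      }
    }
  }
}
```

Restart Claude Desktop after editing the config so the MCP server list is reloaded.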
Environment Variables
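The service reads its configuration from environment variables. Only `MCP_MEMORY_STORAGE_BACKEND` appears elsewhere in this README (with the value `chromadb` for the fallback); the `sqlite_vec` value below is an assumption for the recommended local backend, so check the Installation Guide for the full variable list:

```shell
# Select the storage backend; "chromadb" is the documented fallback value,
# "sqlite_vec" is assumed here for the recommended local backend.
export MCP_MEMORY_STORAGE_BACKEND=sqlite_vec
echo "backend: $MCP_MEMORY_STORAGE_BACKEND"
```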
🏗️ Architecture
🛠️ Development
Project Structure
Contributing
- Fork the repository
- Create a feature branch
- Make your changes with tests
- Submit a pull request
See CONTRIBUTING.md for detailed guidelines.
🆘 Support
- 📖 Documentation: Wiki - Comprehensive guides
- 🐛 Bug Reports: GitHub Issues
- 💬 Discussions: GitHub Discussions
- 🔧 Troubleshooting: Troubleshooting Guide
📊 In Production
Real-world metrics from active deployments:
- 750+ memories stored and actively used
- <500ms response time for semantic search
- 65% token reduction in Claude Code sessions
- 96.7% faster context setup (15min → 30sec)
- 100% knowledge retention across sessions
🏆 Recognition
- Verified MCP Server
- Featured AI Tool
- Production-tested across 13+ AI applications
- Community-driven with real-world feedback and improvements
📄 License
Apache License 2.0 - see LICENSE for details.
Ready to supercharge your AI workflow? 🚀
👉 Start with our Installation Guide or explore the Wiki for comprehensive documentation.
Transform your AI conversations into persistent, searchable knowledge that grows with you.