MCP Memory Service
An MCP server providing semantic memory and persistent storage for Claude Desktop, built on ChromaDB and sentence transformers. It enables long-term memory storage with semantic search, making it ideal for maintaining context across conversations and instances.
Help
Talk to the Repo with TalkToGitHub!
Features
- Semantic search using sentence transformers
- Natural language time-based recall (e.g., "last week", "yesterday morning")
- Tag-based memory retrieval system
- Persistent storage using ChromaDB
- Automatic database backups
- Memory optimization tools
- Exact match retrieval
- Debug mode for similarity analysis
- Database health monitoring
- Duplicate detection and cleanup
- Customizable embedding model
- Cross-platform compatibility (Apple Silicon, Intel, Windows, Linux)
- Hardware-aware optimizations for different environments
- Graceful fallbacks for limited hardware resources
Installation
Quick Start (Recommended)
The enhanced installation script automatically detects your system and installs the appropriate dependencies:
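A typical run looks something like this (the repository URL and clone step are assumptions; adjust to where you obtained the code):

```bash
# Clone the repository (URL assumed) and run the installer
git clone https://github.com/doobidoo/mcp-memory-service.git
cd mcp-memory-service
python install.py
```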
The `install.py` script will:
- Detect your system architecture and available hardware accelerators
- Install the appropriate dependencies for your platform
- Configure the optimal settings for your environment
- Verify the installation and provide diagnostics if needed
Docker Installation
You can run the Memory Service using Docker:
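A minimal sketch of the standard setup, assuming you build from the compose files listed below:

```bash
# Build the image and start the service with the standard compose file
docker-compose up -d --build
```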
We provide multiple Docker Compose configurations for different scenarios:
- `docker-compose.yml` - Standard configuration using pip install
- `docker-compose.uv.yml` - Alternative configuration using UV package manager
- `docker-compose.pythonpath.yml` - Configuration with explicit PYTHONPATH settings
To use an alternative configuration:
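For example, to start the UV-based variant (file name taken from the list above):

```bash
docker-compose -f docker-compose.uv.yml up -d --build
```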
Windows Installation (Special Case)
Windows users may encounter PyTorch installation issues due to platform-specific wheel availability. Use our Windows-specific installation script:
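The script is invoked from the repository root (path as referenced in the troubleshooting tips below):

```bash
python scripts/install_windows.py
```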
This script handles:
- Detecting CUDA availability and version
- Installing the appropriate PyTorch version from the correct index URL
- Installing other dependencies without conflicting with PyTorch
- Verifying the installation
Installing via Smithery
To install Memory Service for Claude Desktop automatically via Smithery:
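A sketch of the Smithery CLI invocation; the package identifier shown here is an assumption, so check the Smithery registry for the exact name:

```bash
npx -y @smithery/cli install mcp-memory-service --client claude
```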
Detailed Installation Guide
For comprehensive installation instructions and troubleshooting, see the Installation Guide.
Claude MCP Configuration
Standard Configuration
Add the following to your `claude_desktop_config.json` file:
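A minimal sketch of the entry, assuming the service is launched with uv from a local checkout; the paths and environment variable names are placeholders to adapt to your installation:

```json
{
  "mcpServers": {
    "memory": {
      "command": "uv",
      "args": ["--directory", "/path/to/mcp-memory-service", "run", "memory"],
      "env": {
        "MCP_MEMORY_CHROMA_PATH": "/path/to/chroma_db",
        "MCP_MEMORY_BACKUPS_PATH": "/path/to/backups"
      }
    }
  }
}
```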
Windows-Specific Configuration (Recommended)
For Windows users, we recommend using the wrapper script to ensure PyTorch is properly installed:
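A sketch of a config entry that points Claude Desktop at the wrapper; the wrapper filename and paths are assumptions, so substitute the script shipped with your checkout:

```json
{
  "mcpServers": {
    "memory": {
      "command": "python",
      "args": ["C:\\path\\to\\mcp-memory-service\\memory_wrapper.py"],
      "env": {
        "MCP_MEMORY_CHROMA_PATH": "C:\\path\\to\\chroma_db",
        "MCP_MEMORY_BACKUPS_PATH": "C:\\path\\to\\backups"
      }
    }
  }
}
```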
The wrapper script will:
- Check if PyTorch is installed and properly configured
- Install PyTorch with the correct index URL if needed
- Run the memory server with the appropriate configuration
Usage Guide
For detailed instructions on how to interact with the memory service in Claude Desktop:
- Invocation Guide - Learn the specific keywords and phrases that trigger memory operations in Claude
- Installation Guide - Detailed setup instructions
The memory service is invoked through natural language commands in your conversations with Claude. For example:
- To store: "Please remember that my project deadline is May 15th."
- To retrieve: "Do you remember what I told you about my project deadline?"
- To delete: "Please forget what I told you about my address."
See the Invocation Guide for a complete list of commands and detailed usage examples.
Memory Operations
The memory service provides the following operations through the MCP server:
Core Memory Operations
- `store_memory` - Store new information with optional tags
- `retrieve_memory` - Perform semantic search for relevant memories
- `recall_memory` - Retrieve memories using natural language time expressions
- `search_by_tag` - Find memories using specific tags
- `exact_match_retrieve` - Find memories with exact content match
- `debug_retrieve` - Retrieve memories with similarity scores
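For illustration, a hypothetical MCP `tools/call` request for `store_memory` might look like this; the argument names are assumptions based on the feature list, not the server's exact schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "store_memory",
    "arguments": {
      "content": "Project deadline is May 15th.",
      "metadata": { "tags": ["project", "deadlines"] }
    }
  }
}
```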
Database Management
- `create_backup` - Create database backup
- `get_stats` - Get memory statistics
- `optimize_db` - Optimize database performance
- `check_database_health` - Get database health metrics
- `check_embedding_model` - Verify model status
Memory Management
- `delete_memory` - Delete specific memory by hash
- `delete_by_tag` - Delete all memories with a specific tag
- `cleanup_duplicates` - Remove duplicate entries
Configuration Options
Configure through environment variables:
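A sketch of a typical environment setup; `MCP_MEMORY_BATCH_SIZE` and `PYTORCH_ENABLE_MPS_FALLBACK` appear in the troubleshooting tips below, while the storage-path variable names are assumptions:

```bash
# Where ChromaDB data and backups live (variable names assumed)
export MCP_MEMORY_CHROMA_PATH="$HOME/.mcp_memory/chroma_db"
export MCP_MEMORY_BACKUPS_PATH="$HOME/.mcp_memory/backups"

# Reduce batch size on memory-constrained machines (see troubleshooting tips)
export MCP_MEMORY_BATCH_SIZE=4

# Allow CPU fallback for unsupported MPS operations on Apple Silicon
export PYTORCH_ENABLE_MPS_FALLBACK=1
```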
Hardware Compatibility
| Platform | Architecture | Accelerator | Status |
|----------|--------------|-------------|--------|
| macOS | Apple Silicon (M1/M2/M3) | MPS | ✅ Fully supported |
| macOS | Apple Silicon under Rosetta 2 | CPU | ✅ Supported with fallbacks |
| macOS | Intel | CPU | ✅ Fully supported |
| Windows | x86_64 | CUDA | ✅ Fully supported |
| Windows | x86_64 | DirectML | ✅ Supported |
| Windows | x86_64 | CPU | ✅ Supported with fallbacks |
| Linux | x86_64 | CUDA | ✅ Fully supported |
| Linux | x86_64 | ROCm | ✅ Supported |
| Linux | x86_64 | CPU | ✅ Supported with fallbacks |
| Linux | ARM64 | CPU | ✅ Supported with fallbacks |
Testing
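The checks can be run like this (a sketch, assuming a pytest-based suite under tests/; the installation test script is referenced in the troubleshooting tips below):

```bash
# Run the unit tests (assuming a pytest-based suite under tests/)
pytest tests/

# Verify the installed environment end to end
python scripts/test_installation.py
```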
Troubleshooting
See the Installation Guide for detailed troubleshooting steps.
Quick Troubleshooting Tips
- Windows PyTorch errors: Use `python scripts/install_windows.py`
- macOS Intel dependency conflicts: Use `python install.py --force-compatible-deps`
- Recursion errors: Run `python scripts/fix_sitecustomize.py`
- Environment verification: Run `python scripts/verify_environment_enhanced.py`
- Memory issues: Set `MCP_MEMORY_BATCH_SIZE=4` and try a smaller model
- Apple Silicon: Ensure Python 3.10+ built for ARM64, set `PYTORCH_ENABLE_MPS_FALLBACK=1`
- Installation testing: Run `python scripts/test_installation.py`
Project Structure
Development Guidelines
- Python 3.10+ with type hints
- Use dataclasses for models
- Triple-quoted docstrings for modules and functions
- Async/await pattern for all I/O operations
- Follow PEP 8 style guidelines
- Include tests for new features
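A minimal sketch illustrating these conventions; the module and names below are hypothetical, not part of the codebase:

```python
"""Hypothetical module illustrating the project's coding conventions."""

from dataclasses import dataclass, field

# In-memory stand-in for the real ChromaDB-backed store (illustration only).
_STORE: dict[str, "Memory"] = {}


@dataclass
class Memory:
    """A stored memory with its content, content hash, and tags."""

    content: str
    content_hash: str
    tags: list[str] = field(default_factory=list)


async def store_memory(memory: Memory) -> str:
    """Persist a memory and return its content hash."""
    # All I/O in the real service (database writes, embedding calls) is async.
    _STORE[memory.content_hash] = memory
    return memory.content_hash
```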
License
MIT License - See LICENSE file for details
Acknowledgments
- ChromaDB team for the vector database
- Sentence Transformers project for embedding models
- MCP project for the protocol specification
Contact
Integrations
The MCP Memory Service can be extended with various tools and utilities. See Integrations for a list of available options, including:
- MCP Memory Dashboard - Web UI for browsing and managing memories
- Claude Memory Context - Inject memory context into Claude project instructions