# MCPMem

A robust Model Context Protocol (MCP) tool for storing and searching memories, with semantic search capabilities built on SQLite and vector embeddings.
## Author

Jay Simons - https://yaa.bz

## Features
- 🧠 Memory Storage: Store text-based memories with metadata
- 🔍 Semantic Search: Find memories by meaning, not just keywords
- ⚡ Vector Embeddings: Uses OpenAI's embedding models for semantic understanding
- 🗄️ SQLite Backend: Lightweight, local database with vector search capabilities
- 🔧 MCP Compatible: Works with any MCP-compatible AI assistant
- 💻 CLI Tools: Full command-line interface for memory management
- 📦 Easy Installation: Install via npm and start using immediately
- ⚙️ Flexible Config: Use config files or environment variables
## Installation

### Global Installation (Recommended)
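A typical global install looks like the following (this assumes the package is published to npm under the name `mcpmem`; adjust if the package name differs):

```shell
# Install the CLI and MCP server globally (package name assumed)
npm install -g mcpmem

# Verify the installation
mcpmem --version
```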
## Quick Start

### Option 1: Using Environment Variables (Simplest)
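With an API key in the environment, the server can be started directly. A minimal sketch (the bare `mcpmem` invocation is an assumption; check `mcpmem --help` for the exact startup command):

```shell
# Provide the OpenAI API key via the environment
export OPENAI_API_KEY=sk-...

# Start the MCP server with defaults; the SQLite database file
# is created automatically on first run
mcpmem
```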
### Option 2: Using Configuration File

1. Initialize the configuration. This creates `mcpmem.config.json` and updates `.gitignore`.
2. Edit the configuration file and add your OpenAI API key.
3. Test the configuration.
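A minimal `mcpmem.config.json` might look like this (field names are illustrative; check the file generated by the init step for the exact schema):

```json
{
  "openaiApiKey": "sk-...",
  "embeddingModel": "text-embedding-3-small",
  "databasePath": "./memories.db"
}
```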
## CLI Usage

MCPMem provides a comprehensive command-line interface for managing memories:
### 📝 Storing Memories

### 🔍 Searching Memories

### 📋 Listing Memories

### 🔍 Getting a Specific Memory

### 🗑️ Deleting Memories

### 📊 Database Info

### 📚 Help
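Putting the operations above together, a session might look like this (subcommand names and flags are assumptions based on the tool list; run `mcpmem --help` for the authoritative syntax):

```shell
# Store a memory (the --metadata flag is an assumption)
mcpmem store "Prefers TypeScript over JavaScript" --metadata '{"topic":"preferences"}'

# Semantic search by meaning, not keywords
mcpmem search "what languages does the user like?"

# List, inspect, and delete by ID
mcpmem list
mcpmem get <memory-id>
mcpmem delete <memory-id>

# Database statistics
mcpmem stats
```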
## MCP Server Usage

### Using with Cursor/Claude Desktop
Add to your MCP configuration file:
### With Environment Variables (Recommended)
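A typical entry follows the standard `mcpServers` format used by Cursor and Claude Desktop (the `command` value below assumes a global npm install):

```json
{
  "mcpServers": {
    "mcpmem": {
      "command": "mcpmem",
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```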
### Available MCP Tools
When running as an MCP server, the following tools are available:
- `store_memory`: Store a new memory with optional metadata
- `search_memories`: Search memories using semantic similarity
- `get_memory`: Retrieve a specific memory by ID
- `get_all_memories`: Get all memories (most recent first)
- `update_memory`: Update an existing memory
- `delete_memory`: Delete a memory by ID
- `get_memory_stats`: Get statistics about the memory database
- `get_version`: Get the version of mcpmem
- `ls_db`: Show database file location and details
- `clear_all_memories`: Delete all memories from the database
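For example, an assistant invoking `store_memory` would send arguments shaped roughly like this (the exact parameter names are an assumption; consult the input schema the server advertises for each tool):

```json
{
  "name": "store_memory",
  "arguments": {
    "content": "User's project uses PostgreSQL 16",
    "metadata": { "source": "conversation" }
  }
}
```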
## Examples

### CLI Examples

### MCP Usage Examples
When connected to an MCP-compatible assistant:
## Development

### Building

### Testing
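Assuming standard npm scripts (not confirmed by this README), building and testing from a checkout would look like:

```shell
git clone <repository-url>   # see the GitHub repository link below
cd mcpmem
npm install
npm run build
npm test
```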
## Database

MCPMem uses SQLite with the `sqlite-vec` extension for vector similarity search. The database schema includes:

- `memories`: Stores memory content, metadata, and timestamps
- `embeddings`: Stores vector embeddings for semantic search

The database file is created automatically and includes proper indexing for fast retrieval.
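The two tables could be sketched as follows (a hypothetical schema for illustration only; the actual DDL is defined by MCPMem and relies on sqlite-vec's `vec0` virtual table for embeddings):

```sql
-- Illustrative sketch, not the actual MCPMem schema
CREATE TABLE memories (
  id         TEXT PRIMARY KEY,
  content    TEXT NOT NULL,
  metadata   TEXT,                  -- JSON blob
  created_at TEXT DEFAULT CURRENT_TIMESTAMP
);

-- sqlite-vec stores vectors in a virtual table for fast KNN search
CREATE VIRTUAL TABLE embeddings USING vec0(
  embedding float[1536]             -- matches text-embedding-3-small
);
```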
## Supported Embedding Models

Currently supports OpenAI embedding models:

- `text-embedding-3-small` (1536 dimensions, default)
- `text-embedding-3-large` (3072 dimensions)
- `text-embedding-ada-002` (1536 dimensions, legacy)
## Troubleshooting

### Common Issues

- **"OPENAI_API_KEY environment variable is required"**
  - Set the environment variable: `export OPENAI_API_KEY=sk-...`
  - Or add it to your `mcpmem.config.json` file
- **"Could not determine executable to run" (with npx)**
  - The package might not be published yet
  - Use a local installation instead: `npm install -g /path/to/mcpmem`
- **Database permission errors**
  - Ensure the directory for the database path exists and is writable
  - MCPMem automatically creates parent directories
- **Vector search not working**
  - Ensure you have a valid OpenAI API key
  - Check that embeddings are being generated: `mcpmem stats`
### Debug Commands
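The stats command mentioned above is the main debugging entry point; for instance (assuming the `stats` subcommand and a `--help` flag, as the CLI sections suggest):

```shell
# Show database location, memory count, and embedding status
mcpmem stats

# Full command reference
mcpmem --help
```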
## License
MIT
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests if applicable
5. Submit a pull request
For more information and updates, visit the GitHub repository.