Local Mem0 MCP Server
A fully self-hosted Model Context Protocol (MCP) server that integrates Mem0 for persistent memory capabilities. Enables AI assistants like Claude to store and retrieve contextual information across conversations.
Features
Persistent Memory: Store and retrieve memories across conversations
Fully Self-Hosted: No external APIs or cloud dependencies
Containerized: Complete Docker deployment with one command
Easy Installation: Single script setup for Windows, Mac, and Linux
Local AI Models: Uses Ollama with phi3:mini and nomic-embed-text
Vector Storage: PostgreSQL with pgvector for efficient memory search
MCP Compatible: Works with Claude Desktop and other MCP-capable AI tools
Quick Start
Prerequisites
Docker Desktop installed and running
Claude Desktop (for testing)
Installation
Windows:
Mac/Linux:
The installation will:
Build the MCP server container
Start PostgreSQL and Ollama services
Download AI models (~2.5GB total)
Configure Claude Desktop integration
Test the installation
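The stack described above (MCP server, PostgreSQL with pgvector, Ollama) could be sketched as a Docker Compose file. This is an illustrative sketch only, not the project's actual Compose file: the service names, images, build context, and credentials here are assumptions; the ports match those listed under System Configuration below.

```yaml
services:
  mem0-mcp-server:
    build: .                       # assumed: the MCP server image is built from this repo
    container_name: mem0-mcp-server
    depends_on: [postgres, ollama]
  postgres:
    image: pgvector/pgvector:pg16  # PostgreSQL with the pgvector extension
    environment:
      POSTGRES_PASSWORD: example   # assumed credentials
    ports: ["5432:5432"]
  ollama:
    image: ollama/ollama           # serves phi3:mini and nomic-embed-text locally
    ports: ["11434:11434"]
```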
Testing
After installation and configuration:
Restart Claude Desktop completely (close and reopen)
Verify MCP server: Type /mcp - it should list mem0-local as available
Test memory storage: "Remember that I'm testing the MCP memory system today"
Test memory retrieval: "What do you remember about me?"
Verify persistence: Restart Claude Desktop and ask again - memories should persist
Troubleshooting MCP Connection:
If /mcp shows no servers, check the configuration file path and JSON syntax
Ensure Docker containers are running: docker ps
Check MCP server logs: docker logs mem0-mcp-server
Architecture
Configuration
Claude Desktop MCP Configuration
After installation, configure Claude Desktop to use the MCP server:
Windows:
Edit %APPDATA%\Claude\claude_desktop_config.json:
Mac:
Edit ~/Library/Application Support/Claude/claude_desktop_config.json:
Linux:
Edit ~/.config/Claude/claude_desktop_config.json:
Add this configuration:
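A typical mcpServers entry for a Docker-based server might look like the sketch below. The server name mem0-local and the container name mem0-mcp-server appear elsewhere in this README; the command and args (running a script inside the container over STDIO) are assumptions and depend on how the install script wires the container:

```json
{
  "mcpServers": {
    "mem0-local": {
      "command": "docker",
      "args": ["exec", "-i", "mem0-mcp-server", "python", "/app/server.py"]
    }
  }
}
```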
System Configuration
The system is configured for local operation by default:
MCP Server: Runs in Docker container with STDIO transport
Database: PostgreSQL with pgvector on port 5432
AI Models: Local Ollama instance on port 11434
Memory Storage: User-isolated memories with vector embeddings
Available Memory Operations
add_memory: Store new memories
search_memories: Find relevant memories by query
get_all_memories: Retrieve all memories for a user
update_memory: Modify existing memories
delete_memory: Remove specific memories
delete_all_memories: Clear all memories for a user
get_memory_stats: Get memory statistics
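The semantics of these operations can be illustrated with a minimal, user-isolated in-memory store. This Python sketch is purely conceptual and is not the server's implementation: it uses substring matching where the real server ranks memories by vector similarity, and the method signatures are assumptions.

```python
from collections import defaultdict
from itertools import count

class MemoryStore:
    """Conceptual sketch of the memory operations; not the server's actual code."""

    def __init__(self):
        self._memories = defaultdict(dict)  # user_id -> {memory_id: text}
        self._ids = count(1)

    def add_memory(self, user_id, text):
        memory_id = next(self._ids)
        self._memories[user_id][memory_id] = text
        return memory_id

    def search_memories(self, user_id, query):
        # Stand-in for vector search: case-insensitive substring match.
        return [t for t in self._memories[user_id].values() if query.lower() in t.lower()]

    def get_all_memories(self, user_id):
        return list(self._memories[user_id].values())

    def update_memory(self, user_id, memory_id, text):
        self._memories[user_id][memory_id] = text

    def delete_memory(self, user_id, memory_id):
        self._memories[user_id].pop(memory_id, None)

    def delete_all_memories(self, user_id):
        self._memories[user_id].clear()

    def get_memory_stats(self, user_id):
        return {"user_id": user_id, "memory_count": len(self._memories[user_id])}

store = MemoryStore()
store.add_memory("alice", "Prefers the MIT license for open source projects")
print(store.search_memories("alice", "license"))
print(store.get_memory_stats("alice"))
```

Note that every operation is keyed by user_id, mirroring the user-isolated storage described under System Configuration.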
Troubleshooting
Check services:
View logs:
Restart services:
Clean restart:
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the MIT License - see the LICENSE file for details.