# Local Mem0 MCP Server
A fully self-hosted Model Context Protocol (MCP) server that integrates [Mem0](https://github.com/mem0ai/mem0) for persistent memory capabilities. It enables AI assistants such as Claude to store and retrieve contextual information across conversations.
## ✨ Features
- 🧠 **Persistent Memory**: Store and retrieve memories across conversations
- 🏠 **Fully Self-Hosted**: No external APIs or cloud dependencies
- 🐳 **Containerized**: Complete Docker deployment with one command
- 📦 **Easy Installation**: Single-script setup for Windows, Mac, and Linux
- 🤖 **Local AI Models**: Uses Ollama with phi3:mini and nomic-embed-text
- 🔍 **Vector Storage**: PostgreSQL with pgvector for efficient memory search
- 🔌 **MCP Compatible**: Works with Claude Desktop and other MCP-capable AI tools
## 🚀 Quick Start
### Prerequisites
- [Docker Desktop](https://www.docker.com/products/docker-desktop) installed and running
- [Claude Desktop](https://claude.ai/download) (for testing)
### Installation
**Windows:**
```cmd
git clone https://github.com/Synapse-OS/local-mem0-mcp.git
cd local-mem0-mcp
install.bat
```
**Mac/Linux:**
```bash
git clone https://github.com/Synapse-OS/local-mem0-mcp.git
cd local-mem0-mcp
chmod +x install.sh
./install.sh
```
The install script will:
1. Build the MCP server container
2. Start PostgreSQL and Ollama services
3. Download AI models (~2.5GB total)
4. Configure Claude Desktop integration
5. Test the installation
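If the script fails partway, the same steps can be run by hand. A minimal sketch, assuming the compose file name used elsewhere in this README and an Ollama container named `mem0-ollama` (check `docker ps` for the actual name):
```bash
# Build images and start PostgreSQL, Ollama, and the MCP server in the background
docker-compose -f docker-compose.local.yml up -d --build

# Pull the two local models (~2.5GB total); the container name is an
# assumption -- confirm it with `docker ps`
docker exec mem0-ollama ollama pull phi3:mini
docker exec mem0-ollama ollama pull nomic-embed-text
```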
### Testing
After installation and configuration:
1. **Restart Claude Desktop** completely (close and reopen)
2. **Verify MCP server**: Type `/mcp`; it should list `mem0-local` as available
3. **Test memory storage**: "Remember that I'm testing the MCP memory system today"
4. **Test memory retrieval**: "What do you remember about me?"
5. **Verify persistence**: Restart Claude Desktop and ask again - memories should persist
**Troubleshooting MCP Connection:**
- If `/mcp` shows no servers, check the configuration file path and JSON syntax
- Ensure Docker containers are running: `docker ps`
- Check MCP server logs: `docker logs mem0-mcp-server`
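A malformed config file is a common culprit; you can validate the JSON with Python's built-in parser before restarting Claude Desktop (Mac path shown; substitute the path for your OS from the Configuration section below):
```bash
# Prints the parsed config on success; reports the line and column of the
# first syntax error otherwise
python3 -m json.tool "$HOME/Library/Application Support/Claude/claude_desktop_config.json"
```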
## 🏗️ Architecture
```
┌─────────────────┐      ┌──────────────────┐      ┌─────────────────┐
│     Claude      │      │    MCP Server    │      │   PostgreSQL    │
│     Desktop     │◄────►│    (FastMCP)     │◄────►│   + pgvector    │
└─────────────────┘      └──────────────────┘      └─────────────────┘
                                   │
                                   ▼
                         ┌──────────────────┐
                         │      Ollama      │
                         │   phi3:mini +    │
                         │ nomic-embed-text │
                         └──────────────────┘
```
## 🔧 Configuration
### Claude Desktop MCP Configuration
After installation, configure Claude Desktop to use the MCP server:
**Windows:**
Edit `%APPDATA%\Claude\claude_desktop_config.json`:
**Mac:**
Edit `~/Library/Application Support/Claude/claude_desktop_config.json`:
**Linux:**
Edit `~/.config/Claude/claude_desktop_config.json`:
Add this configuration:
```json
{
"mcpServers": {
"mem0-local": {
"command": "docker",
"args": [
"exec", "-i", "mem0-mcp-server",
"python", "/app/src/server.py"
]
}
}
}
```
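Since Claude Desktop launches the server with this exact `docker exec` command, you can smoke-test it from a terminal. A sketch of a minimal MCP handshake over STDIO, assuming the standard JSON-RPC `initialize`/`tools/list` sequence (the `protocolVersion` your build expects may differ):
```bash
# Open an MCP session by hand: initialize, acknowledge, then list tools.
# A healthy server replies with its capabilities and the memory tool schemas.
printf '%s\n' \
  '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.1"}}}' \
  '{"jsonrpc":"2.0","method":"notifications/initialized"}' \
  '{"jsonrpc":"2.0","id":2,"method":"tools/list"}' \
  | docker exec -i mem0-mcp-server python /app/src/server.py
```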
### System Configuration
The system is configured for local operation by default:
- **MCP Server**: Runs in Docker container with STDIO transport
- **Database**: PostgreSQL with pgvector on port 5432
- **AI Models**: Local Ollama instance on port 11434
- **Memory Storage**: User-isolated memories with vector embeddings
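Both backing services can be probed directly on those ports. The Ollama check below uses its `/api/tags` model-listing endpoint, and `pg_isready` ships with PostgreSQL; the Postgres container name is an assumption (check `docker ps`):
```bash
# Ollama: lists installed models over HTTP on port 11434
curl http://localhost:11434/api/tags

# PostgreSQL: pg_isready exits 0 when the server accepts connections
# (container name is an assumption -- check `docker ps`)
docker exec mem0-postgres pg_isready -p 5432
```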
## 📝 Available Memory Operations
- **add_memory**: Store new memories
- **search_memories**: Find relevant memories by query
- **get_all_memories**: Retrieve all memories for a user
- **update_memory**: Modify existing memories
- **delete_memory**: Remove specific memories
- **delete_all_memories**: Clear all memories for a user
- **get_memory_stats**: Get memory statistics
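Each of these is exposed as a standard MCP tool, so it can also be driven from the command line using the same STDIO session shown in the Configuration section. A hedged example of calling `add_memory`; the argument names `text` and `user_id` are assumptions, so confirm them against the `tools/list` output first:
```bash
# Initialize the session, then invoke add_memory in one pipeline.
# Argument names are assumptions -- verify them via tools/list.
printf '%s\n' \
  '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"cli","version":"0.0.1"}}}' \
  '{"jsonrpc":"2.0","method":"notifications/initialized"}' \
  '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"add_memory","arguments":{"text":"Testing the MCP memory system","user_id":"default"}}}' \
  | docker exec -i mem0-mcp-server python /app/src/server.py
```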
## 🔍 Troubleshooting
**Check services:**
```bash
docker ps
```
**View logs:**
```bash
docker-compose -f docker-compose.local.yml logs
```
**Restart services:**
```bash
docker-compose -f docker-compose.local.yml restart
```
**Clean restart:**
```bash
docker-compose -f docker-compose.local.yml down -v
# Run install script again
```
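If memories aren't being stored, an incomplete model download is a frequent cause; you can verify what Ollama actually has locally (container name is an assumption, check `docker ps`):
```bash
# Both phi3:mini and nomic-embed-text should appear in this list
docker exec mem0-ollama ollama list
```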
## 🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## 📄 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## 🙏 Acknowledgments
- [Mem0](https://github.com/mem0ai/mem0) - Memory management framework
- [FastMCP](https://github.com/jlowin/fastmcp) - MCP server implementation
- [Ollama](https://ollama.ai/) - Local AI model inference
- [pgvector](https://github.com/pgvector/pgvector) - Vector similarity search