1. Click "Install Server".
2. Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
3. In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@MemoVault Remember that I prefer using Python for backend development".
4. That's it! The server will respond to your query, and you can continue using it as needed.
# MemoVault

A simplified personal memory system for AI assistants, designed for Claude Code integration via MCP.
## Features

- **MCP Server**: First-class integration with Claude Code
- **Flexible Backends**: Support for OpenAI and Ollama (local) LLMs
- **Vector Search**: Semantic memory retrieval using Qdrant
- **Simple JSON Storage**: Lightweight option for basic use cases
- **Easy Configuration**: Environment-based setup
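The vector-search feature ranks stored memories by embedding similarity rather than keyword match. A minimal, self-contained sketch of that retrieval idea, using tiny hand-made vectors in place of a real embedding model and Qdrant index (both are stand-ins, not MemoVault's actual implementation):

```python
import math

# Toy embeddings standing in for a real embedding model (illustrative only).
memories = {
    "I prefer Python for backend development": [0.9, 0.1, 0.0],
    "My project deadline is March 15th": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of the norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_vec, top_k=1):
    # Rank all stored memories by similarity to the query vector.
    ranked = sorted(memories, key=lambda m: cosine(memories[m], query_vec), reverse=True)
    return ranked[:top_k]

# A query vector pointing roughly at the "Python" memory.
print(search([1.0, 0.0, 0.1]))  # → ['I prefer Python for backend development']
```

In the real library the query string would first be embedded by the configured embedder backend; the ranking step is the same idea at scale.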
## Quick Start

### Installation

```shell
pip install memovault

# For local embeddings (optional)
pip install memovault[local]
```

### Basic Usage
```python
from memovault import MemoVault

# Initialize with default settings (reads from .env)
mem = MemoVault()

# Add memories
mem.add("I prefer Python for backend development")
mem.add("My project deadline is March 15th")

# Search for relevant memories
results = mem.search("programming preferences")
for result in results:
    print(result.memory)

# Chat with memory context
response = mem.chat("What language should I use for my backend?")
print(response)

# Save memories to disk
mem.dump("./my_memories")
```

## Claude Code Integration
Configure MemoVault in your Claude Code settings:
```json
{
  "mcpServers": {
    "memovault": {
      "command": "memovault-mcp",
      "env": {
        "MEMOVAULT_LLM_BACKEND": "openai",
        "MEMOVAULT_OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```

Use memory commands in Claude Code:
"Remember that I prefer dark mode"
"What do you know about my preferences?"
## Configuration

Copy `.env.example` to `.env` and customize:
```shell
# LLM Backend
MEMOVAULT_LLM_BACKEND=openai  # or "ollama"
MEMOVAULT_OPENAI_API_KEY=sk-...
MEMOVAULT_OPENAI_MODEL=gpt-4o-mini

# Embedder Backend
MEMOVAULT_EMBEDDER_BACKEND=openai  # or "ollama", "sentence_transformer"

# Memory Backend
MEMOVAULT_MEMORY_BACKEND=vector  # or "simple"

# Storage
MEMOVAULT_DATA_DIR=./memovault_data
```

## MCP Tools
| Tool | Description |
|------|-------------|
| | Store new information |
| | Find relevant memories |
| | Memory-enhanced chat |
| | Retrieve specific memory by ID |
| | Remove a memory |
| | Show recent memories |
| | Clear all memories |
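Under MCP, Claude Code invokes these tools via standard JSON-RPC `tools/call` requests. A sketch of what such a call might look like on the wire — the tool name `add_memory` and its `text` argument are hypothetical stand-ins, since the actual tool names are not listed above:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "add_memory",
    "arguments": { "text": "I prefer dark mode" }
  }
}
```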
## Architecture

```
MemoVault/
├── src/memovault/
│   ├── core/       # Main MemoVault class
│   ├── memory/     # Memory backends (simple, vector)
│   ├── llm/        # LLM providers (OpenAI, Ollama)
│   ├── embedder/   # Embedding providers
│   ├── vecdb/      # Vector database (Qdrant)
│   ├── config/     # Configuration management
│   └── api/        # MCP server & REST API
```

## License
MIT