MemoVault
A simplified personal memory system for AI assistants, designed for Claude Code integration via MCP.
Features
MCP Server: First-class integration with Claude Code
Flexible Backends: Support for OpenAI and Ollama (local) LLMs
Vector Search: Semantic memory retrieval using Qdrant
Simple JSON Storage: Lightweight option for basic use cases
Easy Configuration: Environment-based setup
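Conceptually, the vector-search feature works by embedding each memory, embedding the query, and ranking memories by similarity. The sketch below illustrates that idea in plain Python with hand-made toy vectors; in MemoVault the embeddings would come from the configured embedder backend and the ranking from Qdrant, so none of this code is MemoVault's actual implementation.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "embeddings" — in MemoVault these would come from the configured
# embedder backend (OpenAI, Ollama, or sentence-transformers).
memories = {
    "I prefer Python for backend development": [0.9, 0.1, 0.0],
    "My project deadline is March 15th": [0.1, 0.9, 0.2],
}

query_vec = [0.8, 0.2, 0.1]  # toy embedding of "programming preferences"

# Rank memories by similarity to the query, most relevant first.
ranked = sorted(memories, key=lambda m: cosine(memories[m], query_vec), reverse=True)
print(ranked[0])  # → "I prefer Python for backend development"
```

The same idea scales to a real vector database: the store keeps (text, vector) pairs and answers a query with the nearest vectors rather than exact keyword matches.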
Quick Start
Installation
pip install memovault
# For local embeddings (optional)
pip install "memovault[local]"
Basic Usage
from memovault import MemoVault
# Initialize with default settings (reads from .env)
mem = MemoVault()
# Add memories
mem.add("I prefer Python for backend development")
mem.add("My project deadline is March 15th")
# Search for relevant memories
results = mem.search("programming preferences")
for result in results:
    print(result.memory)
# Chat with memory context
response = mem.chat("What language should I use for my backend?")
print(response)
# Save memories to disk
mem.dump("./my_memories")
Claude Code Integration
Configure MemoVault in your Claude Code settings:
{
  "mcpServers": {
    "memovault": {
      "command": "memovault-mcp",
      "env": {
        "MEMOVAULT_LLM_BACKEND": "openai",
        "MEMOVAULT_OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
Use memory commands in Claude Code:
"Remember that I prefer dark mode"
"What do you know about my preferences?"
Configuration
Copy .env.example to .env and customize:
# LLM Backend
MEMOVAULT_LLM_BACKEND=openai # or "ollama"
MEMOVAULT_OPENAI_API_KEY=sk-...
MEMOVAULT_OPENAI_MODEL=gpt-4o-mini
# Embedder Backend
MEMOVAULT_EMBEDDER_BACKEND=openai # or "ollama", "sentence_transformer"
# Memory Backend
MEMOVAULT_MEMORY_BACKEND=vector # or "simple"
# Storage
MEMOVAULT_DATA_DIR=./memovault_data
MCP Tools
| Tool | Description |
| --- | --- |
|  | Store new information |
|  | Find relevant memories |
|  | Memory-enhanced chat |
|  | Retrieve specific memory by ID |
|  | Remove a memory |
|  | Show recent memories |
|  | Clear all memories |
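Claude Code invokes these tools over MCP's JSON-RPC 2.0 transport. The sketch below shows what a `tools/call` request envelope looks like on the wire; the tool name and argument key here are purely illustrative placeholders, not MemoVault's actual tool schema.

```python
import json

# A hypothetical MCP "tools/call" request, as a client like Claude Code
# might send it to the server over stdio (JSON-RPC 2.0). "memory_add" and
# "text" are made-up names for illustration only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "memory_add",  # hypothetical tool name
        "arguments": {"text": "I prefer dark mode"},
    },
}

wire = json.dumps(request)

# Round-trip check: the server parses the same envelope back.
parsed = json.loads(wire)
assert parsed["method"] == "tools/call"
print(parsed["params"]["name"])
```

Natural-language commands like "Remember that I prefer dark mode" are translated by the client into calls of this shape against whichever tool the server advertises.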
Architecture
MemoVault/
├── src/memovault/
│ ├── core/ # Main MemoVault class
│ ├── memory/ # Memory backends (simple, vector)
│ ├── llm/ # LLM providers (OpenAI, Ollama)
│ ├── embedder/ # Embedding providers
│ ├── vecdb/ # Vector database (Qdrant)
│ ├── config/ # Configuration management
│ └── api/ # MCP server & REST API
License
MIT