MCP Memory Tracker
A Model Context Protocol (MCP) server that provides persistent memory capabilities using OpenAI's vector stores. This allows AI assistants to save and search through memories across conversations.
Features
- Save Memories: Store text-based memories in OpenAI vector stores
- Search Memories: Semantic search through saved memories using natural language queries
- Persistent Storage: Memories are stored in OpenAI's cloud infrastructure
- MCP Compatible: Works with any MCP-compatible client (like Claude Desktop)
Prerequisites
- Python 3.8+
- OpenAI API key
- UV package manager (recommended) or pip
Installation
- Clone the repository
- Install dependencies
- Set up environment variables

Create a .env file in the project root:
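A minimal example (OPENAI_API_KEY is the variable name the server expects, as referenced in Troubleshooting; replace the value with your own key):

```
OPENAI_API_KEY=sk-your-key-here
```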
Usage
Running the MCP Server
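With the environment configured, the server can typically be started directly (assuming the entry point is the server.py referenced in the Configuration section):

```
python server.py
```

If you use UV, `uv run server.py` works as well. In most setups your MCP client launches this command for you rather than you running it manually.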
Available Tools
save_memory(memory: str)

Saves a text memory to the vector store.

Parameters:
- memory (string): The text content to save

Returns:

Example:
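For illustration only, here is how the tool might be invoked if called directly in Python (the printed confirmation depends on the server implementation):

```python
# Hypothetical direct call to the tool function defined in server.py
result = save_memory("The user's favorite programming language is Python.")
print(result)
```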
search_memories(query: str)

Searches through saved memories using semantic search.

Parameters:
- query (string): Natural language search query

Returns:

Example:
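Again purely illustrative; the shape of the returned results depends on the server implementation:

```python
# Hypothetical direct call to the search tool defined in server.py
results = search_memories("What programming language does the user prefer?")
print(results)
```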
Integration with MCP Clients
Claude Desktop
Add this server to your Claude Desktop configuration:
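A sketch of what the entry in claude_desktop_config.json could look like; the server name, command, and project path below are placeholders to adapt to your installation:

```json
{
  "mcpServers": {
    "memory-tracker": {
      "command": "uv",
      "args": ["--directory", "/path/to/mcp-memory-tracker", "run", "server.py"]
    }
  }
}
```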
Other MCP Clients
This server implements the standard MCP protocol and should work with any compatible client. Refer to your client's documentation for configuration details.
How It Works
- Vector Store Management: The server automatically creates and manages an OpenAI vector store named "memories"
- Memory Storage: When you save a memory, it's uploaded as a text file to the vector store
- Semantic Search: The search functionality uses OpenAI's vector search capabilities to find relevant memories based on meaning, not just keywords
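The sketch below condenses this flow using the OpenAI Python SDK and FastMCP. It is illustrative, not the project's actual server.py: the exact method locations (for example, vector_stores vs. beta.vector_stores) and the shape of search results vary across OpenAI SDK versions.

```python
import os
from dotenv import load_dotenv
from fastmcp import FastMCP
from openai import OpenAI

load_dotenv()
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
mcp = FastMCP("memory-tracker")

VECTOR_STORE_NAME = "memories"


def _get_or_create_store():
    # Reuse the "memories" vector store if it exists, otherwise create it.
    # In older SDK versions these calls live under client.beta.vector_stores.
    for store in client.vector_stores.list():
        if store.name == VECTOR_STORE_NAME:
            return store
    return client.vector_stores.create(name=VECTOR_STORE_NAME)


@mcp.tool()
def save_memory(memory: str) -> str:
    """Save a text memory to the vector store."""
    store = _get_or_create_store()
    # Each memory is uploaded as a small text file and attached to the store.
    uploaded = client.files.create(
        file=("memory.txt", memory.encode("utf-8")),
        purpose="assistants",
    )
    client.vector_stores.files.create(vector_store_id=store.id, file_id=uploaded.id)
    return "Memory saved."


@mcp.tool()
def search_memories(query: str) -> str:
    """Semantically search saved memories."""
    store = _get_or_create_store()
    results = client.vector_stores.search(vector_store_id=store.id, query=query)
    # Each result carries the matched text chunks; concatenate them for the client.
    texts = [part.text for result in results.data for part in result.content]
    return "\n".join(texts) if texts else "No matching memories found."


if __name__ == "__main__":
    mcp.run()
```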
Configuration
The server uses the following constants, which can be modified in server.py:
- VECTOR_STORE_NAME: Name of the OpenAI vector store (default: "memories")
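For example, to keep memories in a differently named vector store, change the constant in server.py (its exact placement in the file is an assumption):

```python
VECTOR_STORE_NAME = "memories"  # change to any store name you prefer
```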
Dependencies
- fastmcp: MCP server framework
- openai: OpenAI Python SDK
- python-dotenv: Environment variable management
Troubleshooting
Common Issues
- "OPENAI_API_KEY not found": Make sure your
.env
file is properly configured - "'SyncPage' object has no attribute...": This indicates an API response structure issue - check your OpenAI SDK version
- File upload errors: Ensure your OpenAI API key has vector store permissions
Debug Mode
Add print statements to see detailed responses:
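For example, printing the raw search response before it is processed can reveal the attribute names your SDK version actually returns. This fragment assumes the variable names from the sketch in How It Works; adapt them to your code:

```python
# Temporary debugging inside the search logic
response = client.vector_stores.search(vector_store_id=store.id, query=query)
print(response)        # full response object, useful for spotting attribute names
print(response.data)   # the list of individual search results
```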
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Submit a pull request
License
[Add your license here]
Support
For issues and questions:
- Check the troubleshooting section
- Review OpenAI API documentation
- Check MCP protocol documentation