
MCP Memory Tracker

by akaiserg

A Model Context Protocol (MCP) server that provides persistent memory capabilities using OpenAI's vector stores. This allows AI assistants to save and search through memories across conversations.

Features

  • Save Memories: Store text-based memories in OpenAI vector stores
  • Search Memories: Semantic search through saved memories using natural language queries
  • Persistent Storage: Memories are stored in OpenAI's cloud infrastructure
  • MCP Compatible: Works with any MCP-compatible client (like Claude Desktop)

Prerequisites

  • Python 3.8+
  • OpenAI API key
  • UV package manager (recommended) or pip

Installation

  1. Clone the repository:

     ```
     git clone <repository-url>
     cd mcp-memory-tracker
     ```

  2. Install dependencies:

     ```
     # Using UV (recommended)
     uv sync

     # Or using pip
     pip install -r requirements.txt
     ```

  3. Set up environment variables: create a `.env` file in the project root:

     ```
     OPENAI_API_KEY=your_openai_api_key_here
     ```
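The server loads this file with python-dotenv at startup. Conceptually, each line of a `.env` file is just a `KEY=VALUE` pair copied into the process environment; a simplified stdlib-only sketch of that behavior (not the library's actual implementation):

```python
import os

def load_env_line(line: str) -> None:
    """Minimal .env parsing: KEY=VALUE pairs, skipping comments and blanks."""
    line = line.strip()
    if line and not line.startswith("#") and "=" in line:
        key, _, value = line.partition("=")
        os.environ[key.strip()] = value.strip()

load_env_line("# comments are ignored")
load_env_line("OPENAI_API_KEY=your_openai_api_key_here")
print(os.environ["OPENAI_API_KEY"])  # the OpenAI client reads the key from here
```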

Usage

Running the MCP Server

```
# Using UV
uv run server.py

# Or using Python directly
python server.py
```

Available Tools

save_memory(memory: str)

Saves a text memory to the vector store.

Parameters:

  • memory (string): The text content to save

Returns:

```json
{
  "status": "saved",
  "vector store id": "vs_xxxxx"
}
```

Example:

```python
save_memory("I met John at the coffee shop on Main Street. He's a software engineer who loves hiking.")
```

search_memories(query: str)

Searches through saved memories using semantic search.

Parameters:

  • query (string): Natural language search query

Returns:

```json
{
  "status": "success",
  "results": ["matching memory content..."]
}
```

Example:

```python
search_memories("Who did I meet at the coffee shop?")
```
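To make the save/search contract concrete, here is a toy, in-memory stand-in for the two tools. It is not the server's implementation: real storage and semantic search are delegated to OpenAI's vector stores, while this mock uses a plain list and keyword matching purely to illustrate the input and output shapes described above.

```python
_memories = []  # stand-in for the OpenAI vector store

def save_memory(memory):
    """Mimics the save_memory tool's response shape."""
    _memories.append(memory)
    return {"status": "saved", "vector store id": "vs_local_mock"}

def search_memories(query):
    """Keyword matching standing in for real semantic search."""
    hits = [m for m in _memories
            if any(w.lower() in m.lower() for w in query.split())]
    return {"status": "success", "results": hits}

save_memory("I met John at the coffee shop on Main Street.")
print(search_memories("coffee shop"))
# → {'status': 'success', 'results': ['I met John at the coffee shop on Main Street.']}
```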

Integration with MCP Clients

Claude Desktop

Add this server to your Claude Desktop configuration:

```json
{
  "mcpServers": {
    "memory-tracker": {
      "command": "uv",
      "args": ["run", "/path/to/mcp-memory-tracker/server.py"],
      "env": {
        "OPENAI_API_KEY": "your_api_key_here"
      }
    }
  }
}
```

Other MCP Clients

This server implements the standard MCP protocol and should work with any compatible client. Refer to your client's documentation for configuration details.

How It Works

  1. Vector Store Management: The server automatically creates and manages an OpenAI vector store named "memories"
  2. Memory Storage: When you save a memory, it's uploaded as a text file to the vector store
  3. Semantic Search: The search functionality uses OpenAI's vector search capabilities to find relevant memories based on meaning, not just keywords

Configuration

The server uses the following constants that can be modified in server.py:

  • VECTOR_STORE_NAME: Name of the OpenAI vector store (default: "memories")

Dependencies

  • fastmcp: MCP server framework
  • openai: OpenAI Python SDK
  • python-dotenv: Environment variable management

Troubleshooting

Common Issues

  1. "OPENAI_API_KEY not found": Make sure your .env file is properly configured
  2. "'SyncPage' object has no attribute...": list endpoints in the OpenAI SDK return paginated `SyncPage` objects, so iterate over the page (or use its `.data` attribute) rather than indexing it directly, and check that your `openai` package version matches what the code expects
  3. File upload errors: Ensure your OpenAI API key has vector store permissions

Debug Mode

Add print statements to see detailed responses:

```python
print(f"Vector store ID: {vector_store.id}")
print(f"Search results: {results}")
```

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Submit a pull request

License

[Add your license here]

Support

For issues and questions, please open an issue on the repository.
