MCPMem

A robust Model Context Protocol (MCP) tool for storing and searching memories with semantic search capabilities using SQLite and embeddings.

Author

Jay Simons - https://yaa.bz

Features

  • 🧠 Memory Storage: Store text-based memories with metadata
  • 🔍 Semantic Search: Find memories by meaning, not just keywords
  • 🧮 Vector Embeddings: Uses OpenAI's embedding models for semantic understanding
  • 🗄️ SQLite Backend: Lightweight, local database with vector search capabilities
  • 🔧 MCP Compatible: Works with any MCP-compatible AI assistant
  • 💻 CLI Tools: Full command-line interface for memory management
  • 📦 Easy Installation: Install via npm and start using immediately
  • ⚙️ Flexible Config: Use config files or environment variables

Installation

npm install -g mcpmem@latest

Quick Start

Option 1: Using Environment Variables (Simplest)

```shell
# Set your API key
export OPENAI_API_KEY=sk-your-openai-api-key-here

# Optional: customize model and database path
export OPENAI_MODEL=text-embedding-3-small
export MCPMEM_DB_PATH=/path/to/memories.db

# Test the configuration
mcpmem test

# Start using the CLI or MCP server
mcpmem stats
```

Option 2: Using Configuration File

  1. Initialize configuration:

     ```shell
     mcpmem init
     ```

     This creates mcpmem.config.json and updates .gitignore.

  2. Edit the configuration file and add your OpenAI API key:

     ```json
     {
       "embedding": {
         "provider": "openai",
         "apiKey": "your-openai-api-key-here",
         "model": "text-embedding-3-small"
       },
       "database": {
         "path": "./mcpmem.db"
       }
     }
     ```

  3. Test the configuration:

     ```shell
     mcpmem test
     ```

CLI Usage

MCPMem provides a comprehensive command-line interface for managing memories:

📝 Storing Memories

```shell
# Store a simple memory
mcpmem store "Remember to review the quarterly reports"

# Store a memory with metadata
mcpmem store "API endpoint returns 500 errors" -m '{"project":"web-app","severity":"high"}'
```

🔍 Searching Memories

```shell
# Semantic search
mcpmem search "database issues"

# Custom limits and thresholds
mcpmem search "bugs" --limit 5 --threshold 0.8
```
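To build intuition for what --threshold does: semantic search scores each stored memory against the query by comparing embedding vectors, and the threshold discards low-scoring matches. MCPMem's actual scoring is handled internally by sqlite-vec, so the sketch below is an illustration only, assuming cosine similarity over tiny hypothetical vectors (real embeddings have 1536+ dimensions):

```typescript
// Illustration: how a similarity threshold like --threshold 0.8 might filter
// results, assuming cosine similarity. Not MCPMem's actual implementation.

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Hypothetical stored embeddings (real ones have 1536+ dimensions)
const query = [1, 0, 0];
const memories = [
  { id: "a", embedding: [0.9, 0.1, 0] }, // points nearly the same way
  { id: "b", embedding: [0, 1, 0] },     // orthogonal: unrelated meaning
];

const threshold = 0.8;
const hits = memories.filter(
  m => cosineSimilarity(query, m.embedding) >= threshold,
);
console.log(hits.map(m => m.id)); // → [ "a" ]
```

A higher threshold returns fewer, more relevant memories; a lower one casts a wider net.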

📋 Listing Memories

```shell
# Show recent memories
mcpmem list

# Show more memories
mcpmem list --limit 50
```

🔍 Getting Specific Memory

```shell
# Get memory details by ID
mcpmem get abc123-def456-789
```

🗑️ Deleting Memories

```shell
# Delete with confirmation
mcpmem delete abc123-def456-789

# Force delete (no confirmation)
mcpmem delete abc123-def456-789 --force

# Clear all memories (with confirmation)
mcpmem clear

# Force clear all memories (no confirmation)
mcpmem clear --force
```

📊 Database Info

```shell
# Show database statistics
mcpmem stats

# Show database file location and details
mcpmem ls_db
```

📚 Help

```shell
# Show all available commands
mcpmem --help

# Show detailed examples and usage
mcpmem help-commands

# Get help for a specific command
mcpmem search --help
```

MCP Server Usage

Using with Cursor/Claude Desktop

Add to your MCP configuration file:

```json
{
  "mcpServers": {
    "mcpmem": {
      "command": "mcpmem",
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key-here",
        "OPENAI_MODEL": "text-embedding-3-small",
        "MCPMEM_DB_PATH": "/path/to/memories.db"
      }
    }
  }
}
```

Available MCP Tools

When running as an MCP server, the following tools are available:

  • store_memory: Store a new memory with optional metadata
  • search_memories: Search memories using semantic similarity
  • get_memory: Retrieve a specific memory by ID
  • get_all_memories: Get all memories (most recent first)
  • update_memory: Update an existing memory
  • delete_memory: Delete a memory by ID
  • get_memory_stats: Get statistics about the memory database
  • get_version: Get the version of mcpmem
  • ls_db: Show database file location and details
  • clear_all_memories: Delete all memories from the database
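Under the hood, an MCP client invokes these tools with JSON-RPC tools/call requests, per the Model Context Protocol spec. As an illustration, a store_memory invocation might look like the following (the argument names here are assumptions inferred from the CLI flags, not confirmed from MCPMem's tool schemas):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "store_memory",
    "arguments": {
      "content": "Fixed authentication timeout issue in production",
      "metadata": { "severity": "high", "environment": "production" }
    }
  }
}
```

In practice your MCP client (Cursor, Claude Desktop, etc.) constructs these requests for you; you never write them by hand.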

Examples

CLI Examples

```shell
# Store project-related memories
mcpmem store "Fixed the authentication bug in user login" -m '{"project":"web-app","type":"bug-fix"}'
mcpmem store "Meeting notes: Discussed Q4 roadmap priorities" -m '{"type":"meeting","quarter":"Q4"}'

# Search for memories
mcpmem search "authentication issues"
mcpmem search "meeting" --limit 3

# Manage memories
mcpmem list --limit 10
mcpmem get memory-id-here
mcpmem delete old-memory-id --force
mcpmem clear --force
```

MCP Usage Examples

When connected to an MCP-compatible assistant:

```
Assistant: I'll help you store that memory about the bug fix.

*Uses store_memory tool*
- Content: "Fixed authentication timeout issue in production"
- Metadata: {"severity": "high", "environment": "production"}

Memory stored successfully with ID: abc123-def456
```

```
Assistant: Let me search for previous issues related to authentication.

*Uses search_memories tool with query "authentication problems"*

Found 3 related memories:
1. Fixed authentication timeout issue (similarity: 85%)
2. Updated auth middleware configuration (similarity: 78%)
3. Resolved login redirect bug (similarity: 72%)
```

Development

Building

```shell
# Install dependencies
pnpm install

# Build the project
pnpm build

# Type checking
pnpm tc
```

Testing

```shell
# Run tests
pnpm test

# Test configuration
mcpmem test
```

Database

MCPMem uses SQLite with the sqlite-vec extension for vector similarity search. The database schema includes:

  • memories: Stores memory content, metadata, and timestamps
  • embeddings: Stores vector embeddings for semantic search

The database file is created automatically and includes proper indexing for fast retrieval.
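Vector extensions like sqlite-vec conventionally store embeddings as compact BLOBs of little-endian float32 values. As a sketch of that serialization (the function names are hypothetical, not MCPMem's API):

```typescript
// Sketch: packing an embedding vector into a float32 BLOB for SQLite storage,
// the compact binary format sqlite-vec accepts. Function names are
// illustrative, not part of MCPMem.

function embeddingToBlob(embedding: number[]): Buffer {
  const floats = new Float32Array(embedding);
  return Buffer.from(floats.buffer, floats.byteOffset, floats.byteLength);
}

function blobToEmbedding(blob: Buffer): number[] {
  const floats = new Float32Array(
    blob.buffer, blob.byteOffset, blob.byteLength / 4,
  );
  return Array.from(floats);
}

const vec = [0.25, -0.5, 1.0];        // values exactly representable in float32
const blob = embeddingToBlob(vec);
console.log(blob.length);             // 12 bytes: 3 floats × 4 bytes each
console.log(blobToEmbedding(blob));   // → [ 0.25, -0.5, 1 ]
```

Storing 1536-dimensional text-embedding-3-small vectors this way costs 6 KB per memory, which keeps the database lightweight even with thousands of entries.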

Supported Embedding Models

Currently supports OpenAI embedding models:

  • text-embedding-3-small (1536 dimensions, default)
  • text-embedding-3-large (3072 dimensions)
  • text-embedding-ada-002 (1536 dimensions, legacy)

Troubleshooting

Common Issues

  1. "OPENAI_API_KEY environment variable is required"
    • Set the environment variable: export OPENAI_API_KEY=sk-...
    • Or add it to your mcpmem.config.json file
  2. "Could not determine executable to run" (with npx)
    • The package might not be published yet
    • Use local installation instead: npm install -g /path/to/mcpmem
  3. Database permission errors
    • Ensure the directory for the database path exists and is writable
    • MCPMem automatically creates parent directories
  4. Vector search not working
    • Ensure you have a valid OpenAI API key
    • Check that embeddings are being generated: mcpmem stats

Debug Commands

```shell
# Check configuration and connectivity
mcpmem test

# View database statistics
mcpmem stats

# List recent memories to verify storage
mcpmem list --limit 5
```

License

MIT

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

For more information and updates, visit the GitHub repository.
