Mem0 MCP Server

A Model Context Protocol (MCP) server that provides memory capabilities using Mem0, backed by local Postgres (graph store) and Qdrant (vector store).

Features

  • Tools: add_memory, add_fact, search_memories, get_all_memories, database_history

  • Storage: Local Postgres (Graph) & Qdrant (Vector)

  • Transport: Supports both stdio (local) and sse (remote/docker)

  • Deployment: Ready for Docker Compose and GitHub Actions

Prerequisites

  • Docker & Docker Compose

  • Python 3.12+ (for local development)

Quick Start (Docker)

The easiest way to run the full stack (Database + MCP Server) is via Docker Compose:

  1. Configure Environment

    cp .env.example .env # Edit .env to add your OPENAI_API_KEY if needed
  2. Start Services

    # Starts Postgres, Qdrant, and the MCP Server (SSE mode on port 8000)
    docker-compose -f docker-compose.mem0.yml up -d --build
  3. Connect Client

    • The server runs in SSE mode on port 8000.

    • SSE Endpoint: http://localhost:8000/sse
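
To wire the endpoint into an SSE-capable MCP client, a configuration along these lines should work (the exact key names depend on the client; `mem0` as the server name is illustrative):

```json
{
  "mcpServers": {
    "mem0": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```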

Local Development (Python)

If you want to run the server code locally while keeping databases in Docker:

  1. Start Databases Only

    docker-compose -f docker-compose.mem0.yml up -d db qdrant
  2. Install Dependencies

    python -m venv .venv
    source .venv/bin/activate  # or .venv\Scripts\activate on Windows
    pip install .
  3. Run Server

    • Stdio Mode (Default, for use with local MCP clients like Claude Desktop):

      python src/main.py
    • SSE Mode (Network accessible):

      export MCP_TRANSPORT=sse
      python src/main.py  # Server listens on 0.0.0.0:8000
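
For stdio mode with Claude Desktop, an entry along these lines should work (the path is illustrative and depends on where the repo is checked out; add `env` entries if your databases are not on the defaults):

```json
{
  "mcpServers": {
    "mem0": {
      "command": "python",
      "args": ["/path/to/mem0-mcp-server/src/main.py"]
    }
  }
}
```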

Deployment

The repository includes a GitHub Actions workflow (.github/workflows/deploy.yml) to deploy to a remote server (e.g., 192.168.254.202).

  • DB Deploy: Deploys db and qdrant services.

  • App Deploy: Deploys the mcp-server container in SSE mode, mapping port 8000.

Configuration

| Variable | Description | Default |
|---|---|---|
| POSTGRES_USER | Postgres username | kouhai |
| POSTGRES_PASSWORD | Postgres password | password |
| POSTGRES_DB | Database name | kouhai |
| QDRANT_HOST | Qdrant host | localhost |
| MCP_TRANSPORT | Server transport (stdio or sse) | stdio |
| OPENAI_API_KEY | (Optional) For OpenAI embeddings | - |
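
Put together, a minimal .env matching the documented defaults looks like this sketch (uncomment and fill OPENAI_API_KEY only if you use OpenAI embeddings):

```dotenv
POSTGRES_USER=kouhai
POSTGRES_PASSWORD=password
POSTGRES_DB=kouhai
QDRANT_HOST=localhost
MCP_TRANSPORT=stdio
# OPENAI_API_KEY=
```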

Tools

  • add_memory(content, user_id, metadata): Stores a user message.

  • add_fact(content, user_id, metadata): Stores a specific fact (tagged with type: fact).

  • search_memories(query, user_id, limit): Semantic search for relevant memories.

  • get_all_memories(user_id, limit): Retrieves all stored memories for a specific user.
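
Under the hood, MCP clients invoke these tools with standard JSON-RPC 2.0 `tools/call` requests. As a sketch, the payload for `add_memory` would look like the following (the `id`, `user_id`, and metadata values are illustrative):

```python
import json

# MCP tool invocations are JSON-RPC 2.0 requests with method "tools/call".
# The tool name and argument names follow the signatures listed above;
# the concrete values here are illustrative only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "add_memory",
        "arguments": {
            "content": "User prefers dark mode",
            "user_id": "alice",
            "metadata": {"source": "chat"},
        },
    },
}

# Serialize for transport (stdio frames or SSE messages carry this JSON).
print(json.dumps(request, indent=2))
```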

