# Graphiti MCP Server
Graphiti is a framework for building and querying temporally-aware knowledge graphs, specifically tailored for AI agents operating in dynamic environments. Unlike traditional retrieval-augmented generation (RAG) methods, Graphiti continuously integrates user interactions, structured and unstructured enterprise data, and external information into a coherent, queryable graph. The framework supports incremental data updates, efficient retrieval, and precise historical queries without requiring complete graph recomputation, making it suitable for developing interactive, context-aware AI applications.
This is an experimental Model Context Protocol (MCP) server implementation for Graphiti. The MCP server exposes Graphiti's key functionality through the MCP protocol, allowing AI assistants to interact with Graphiti's knowledge graph capabilities.
## Features

The Graphiti MCP server exposes the following key high-level functions of Graphiti:

- **Episode Management**: Add, retrieve, and delete episodes (text, messages, or JSON data)
- **Entity Management**: Search and manage entity nodes and relationships in the knowledge graph
- **Search Capabilities**: Search for facts (edges) and node summaries using semantic and hybrid search
- **Group Management**: Organize and manage groups of related data with `group_id` filtering
- **Graph Maintenance**: Clear the graph and rebuild indices
## Quick Start for Claude Desktop, Cursor, and other clients

1. Clone the Graphiti GitHub repo and note the full path to this directory.
2. Install the Graphiti prerequisites.
3. Configure Claude, Cursor, or other MCP client to use Graphiti with a `stdio` transport. See the client documentation on where to find their MCP configuration files.
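The clone step above can be sketched as follows; the upstream repository location is assumed to be `getzep/graphiti`, and the `mcp_server` subdirectory is the one referenced throughout this README:

```shell
# Clone the repo (HTTPS shown; SSH works too) and enter the MCP server directory
git clone https://github.com/getzep/graphiti.git
cd graphiti/mcp_server
pwd  # note this full path for the client configuration
```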
## Installation

### Prerequisites

- Python 3.10 or higher
- A running Neo4j database (version 5.26 or later required)
- An OpenAI API key for LLM operations
### Setup

1. Clone the repository and navigate to the `mcp_server` directory.
2. Use `uv` to create a virtual environment and install dependencies:
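A typical sequence with a recent `uv` release might be:

```shell
# Create a virtual environment, then install dependencies from pyproject.toml
uv venv
uv sync
```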
## Configuration

The server uses the following environment variables:

- `NEO4J_URI`: URI for the Neo4j database (default: `bolt://localhost:7687`)
- `NEO4J_USER`: Neo4j username (default: `neo4j`)
- `NEO4J_PASSWORD`: Neo4j password (default: `demodemo`)
- `OPENAI_API_KEY`: OpenAI API key (required for LLM operations)
- `OPENAI_BASE_URL`: Optional base URL for the OpenAI API
- `MODEL_NAME`: OpenAI model name to use for LLM operations
- `SMALL_MODEL_NAME`: OpenAI model name to use for smaller LLM operations
- `LLM_TEMPERATURE`: Temperature for LLM responses (0.0-2.0)
- `AZURE_OPENAI_ENDPOINT`: Optional Azure OpenAI endpoint URL
- `AZURE_OPENAI_DEPLOYMENT_NAME`: Optional Azure OpenAI deployment name
- `AZURE_OPENAI_API_VERSION`: Optional Azure OpenAI API version
- `AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME`: Optional Azure OpenAI embedding deployment name
- `AZURE_OPENAI_EMBEDDING_API_VERSION`: Optional Azure OpenAI embedding API version
- `AZURE_OPENAI_USE_MANAGED_IDENTITY`: Optional; use Azure Managed Identities for authentication

You can set these variables in a `.env` file in the project directory.
## Running the Server

To run the Graphiti MCP server directly using `uv`:
With options:
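Assuming the server entry point is named `graphiti_mcp_server.py` (an assumption about the project layout), the two invocations could look like:

```shell
# Plain start (defaults: SSE transport, group_id "default")
uv run graphiti_mcp_server.py

# With options
uv run graphiti_mcp_server.py --transport sse --group-id my-project --use-custom-entities
```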
Available arguments:

- `--model`: Overrides the `MODEL_NAME` environment variable.
- `--small-model`: Overrides the `SMALL_MODEL_NAME` environment variable.
- `--temperature`: Overrides the `LLM_TEMPERATURE` environment variable.
- `--transport`: Choose the transport method (`sse` or `stdio`; default: `sse`).
- `--group-id`: Set a namespace for the graph (optional). If not provided, defaults to "default".
- `--destroy-graph`: If set, destroys all Graphiti graphs on startup.
- `--use-custom-entities`: Enable entity extraction using the predefined `ENTITY_TYPES`.
## Docker Deployment

The Graphiti MCP server can be deployed using Docker. The Dockerfile uses `uv` for package management, ensuring consistent dependency installation.
### Environment Configuration
Before running the Docker Compose setup, you need to configure the environment variables. You have two options:
**Using a `.env` file (recommended):**

1. Copy the provided `.env.example` file to create a `.env` file:

   ```bash
   cp .env.example .env
   ```

2. Edit the `.env` file to set your OpenAI API key and other configuration options:

   ```
   # Required for LLM operations
   OPENAI_API_KEY=your_openai_api_key_here
   MODEL_NAME=gpt-4.1-mini
   # Optional: OPENAI_BASE_URL only needed for non-standard OpenAI endpoints
   # OPENAI_BASE_URL=https://api.openai.com/v1
   ```

3. The Docker Compose setup is configured to use this file if it exists (it's optional).

**Using environment variables directly:**

You can also set the environment variables when running the Docker Compose command:

```bash
OPENAI_API_KEY=your_key MODEL_NAME=gpt-4.1-mini docker compose up
```
### Neo4j Configuration

The Docker Compose setup includes a Neo4j container with the following default configuration:

- Username: `neo4j`
- Password: `demodemo`
- URI: `bolt://neo4j:7687` (from within the Docker network)
- Memory settings optimized for development use
### Running with Docker Compose
Start the services using Docker Compose:
Or if you're using an older version of Docker Compose:
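Both variants launch the same services; only the command name differs between Compose V2 and the legacy standalone binary:

```shell
docker compose up     # Docker Compose V2 (plugin)
docker-compose up     # older standalone Docker Compose
```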
This will start both the Neo4j database and the Graphiti MCP server. The Docker setup:

- Uses `uv` for package management and running the server
- Installs dependencies from the `pyproject.toml` file
- Connects to the Neo4j container using the environment variables
- Exposes the server on port 8000 for HTTP-based SSE transport
- Includes a healthcheck for Neo4j to ensure it's fully operational before starting the MCP server
## Integrating with MCP Clients

### Configuration
To use the Graphiti MCP server with an MCP-compatible client, configure it to connect to the server.

You will need the Python package manager `uv` installed. Please refer to the `uv` installation instructions. Ensure that you set the full path to the `uv` binary and your Graphiti project folder.
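For `stdio` transport, a configuration along the following lines is plausible; the paths, the `graphiti_mcp_server.py` entry-point name, and the exact argument layout are illustrative assumptions rather than confirmed values:

```json
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "stdio",
      "command": "/full/path/to/uv",
      "args": [
        "run",
        "--directory", "/full/path/to/graphiti/mcp_server",
        "graphiti_mcp_server.py",
        "--transport", "stdio"
      ],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "demodemo",
        "OPENAI_API_KEY": "your_openai_api_key_here"
      }
    }
  }
}
```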
For SSE transport (HTTP-based), you can use this configuration:
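A minimal sketch, assuming your client accepts a `url` field for SSE servers and the server is reachable at the default endpoint used elsewhere in this README:

```json
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "sse",
      "url": "http://localhost:8000/sse"
    }
  }
}
```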
## Available Tools

The Graphiti MCP server exposes the following tools:

- `add_episode`: Add an episode to the knowledge graph (supports text, JSON, and message formats)
- `search_nodes`: Search the knowledge graph for relevant node summaries
- `search_facts`: Search the knowledge graph for relevant facts (edges between entities)
- `delete_entity_edge`: Delete an entity edge from the knowledge graph
- `delete_episode`: Delete an episode from the knowledge graph
- `get_entity_edge`: Get an entity edge by its UUID
- `get_episodes`: Get the most recent episodes for a specific group
- `clear_graph`: Clear all data from the knowledge graph and rebuild indices
- `get_status`: Get the status of the Graphiti MCP server and Neo4j connection
## Working with JSON Data

The Graphiti MCP server can process structured JSON data through the `add_episode` tool with `source="json"`. This allows you to automatically extract entities and relationships from structured data:
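As a sketch of the shape of such a call: only the `add_episode` tool name and `source="json"` come from this README, while the other argument names (`name`, `episode_body`, `source_description`) and the sample record are illustrative assumptions. The episode body is passed as a JSON string, so structured data must be serialized first:

```python
import json

# Hypothetical structured record to ingest; entities (the company, its
# products) and their relationships would be extracted automatically.
crm_record = {
    "company": {"name": "Acme Technologies"},
    "products": [
        {"id": "P001", "name": "CloudSync"},
        {"id": "P002", "name": "DataHarbor"},
    ],
}

# Arguments for an add_episode tool call: the body is a serialized JSON
# string, and source="json" tells the server to parse it as structured data.
add_episode_args = {
    "name": "Customer Profile",
    "episode_body": json.dumps(crm_record),
    "source": "json",
    "source_description": "CRM data",
}

print(add_episode_args["source"])  # json
```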
## Integrating with the Cursor IDE

To integrate the Graphiti MCP Server with the Cursor IDE, follow these steps:

1. Run the Graphiti MCP server using the SSE transport.

   Hint: specify a `group_id` to namespace graph data. If you do not specify a `group_id`, the server will use "default" as the `group_id`.

2. Configure Cursor to connect to the Graphiti MCP server.

3. Add the Graphiti rules to Cursor's User Rules. See `cursor_rules.md` for details.

4. Kick off an agent session in Cursor.
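Step 1 above could look like the following, assuming the `graphiti_mcp_server.py` entry-point name; the `group_id` value is illustrative:

```shell
uv run graphiti_mcp_server.py --transport sse --group-id my-cursor-project
```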
The integration enables AI assistants in Cursor to maintain persistent memory through Graphiti's knowledge graph capabilities.
## Integrating with Claude Desktop (Docker MCP Server)
The Graphiti MCP Server container uses the SSE MCP transport. Claude Desktop does not natively support SSE, so you'll need to use a gateway like `mcp-remote`.
1. Run the Graphiti MCP server using SSE transport:

   ```bash
   docker compose up
   ```

2. (Optional) Install `mcp-remote` globally: if you prefer to have `mcp-remote` installed globally, or if you encounter issues with `npx` fetching the package, you can install it globally. Otherwise, `npx` (used in the next step) will handle it for you:

   ```bash
   npm install -g mcp-remote
   ```

3. Configure Claude Desktop: open your Claude Desktop configuration file (usually `claude_desktop_config.json`) and add or modify the `mcpServers` section as follows:

   ```json
   {
     "mcpServers": {
       "graphiti-memory": {
         "command": "npx",
         "args": [
           "mcp-remote",
           "http://localhost:8000/sse"
         ]
       }
     }
   }
   ```

   You can choose a name other than `graphiti-memory` if you prefer, use the full path to `mcp-remote` if `npx` is not in your PATH, and ensure the URL matches your Graphiti server's SSE endpoint. If you already have an `mcpServers` entry, add `graphiti-memory` (or your chosen name) as a new key within it.

4. Restart Claude Desktop for the changes to take effect.
## Requirements

- Python 3.10 or higher
- Neo4j database (version 5.26 or later required)
- OpenAI API key (for LLM operations and embeddings)
- MCP-compatible client
## License
This project is licensed under the same license as the parent Graphiti project.