# Graphiti MCP Demo

We are implementing an MCP server and AI agent integration to leverage Zep's Graphiti for persistent memory and context continuity across Cursor and Claude. This allows AI agents hosted on Cursor and Claude to connect to the MCP server for dynamic tool discovery, select the optimal tool for a given query, and formulate responses informed by past interactions, while Graphiti ensures consistent context across both client platforms.

We use:

- **Graphiti by [Zep AI](https://www.getzep.com/)** as a memory layer for an AI agent
- **Cursor and Claude** (as MCP hosts)

## Set Up

Follow these steps to set up the project before running the MCP server.

### Prerequisites

- Python 3.10 or higher
- [uv](https://github.com/astral-sh/uv) package manager (recommended) or pip
- Neo4j database (use a free [Neo4j Aura](https://neo4j.com/cloud/aura/) cloud instance or any Neo4j instance)
- OpenRouter API key (recommended) or OpenAI API key

### Install Dependencies

Using `uv` (recommended):

```bash
uv sync
```

Or using `pip`:

```bash
pip install -e .
```

Or install dependencies directly:

```bash
pip install mcp neo4j openai python-dotenv
```

### Configuration

Before running the MCP server, you need to configure the environment variables.

1. Copy the example environment file:

   ```bash
   cp .env.example .env
   ```

2. Edit the `.env` file with your actual credentials:

   - Replace `NEO4J_URI` with your Neo4j connection string
   - Replace `NEO4J_PASSWORD` with your Neo4j password
   - Replace `<your_openrouter_api_key>` with your OpenRouter API key (or use `OPENAI_API_KEY` for OpenAI)
   - Adjust `MODEL_NAME` if needed

See `.env.example` for the complete configuration template.
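As a concrete starting point, a filled-in `.env` might look like the sketch below. Only `NEO4J_URI`, `NEO4J_PASSWORD`, `OPENROUTER_API_KEY`/`OPENAI_API_KEY`, and `MODEL_NAME` are named in this README; any other variable name (such as `NEO4J_USER`) is an assumption — treat `.env.example` as the authoritative template:

```
NEO4J_URI=neo4j+s://xxxxx.databases.neo4j.io
# NEO4J_USER is assumed here; confirm the variable name in .env.example
NEO4J_USER=neo4j
NEO4J_PASSWORD=<your_neo4j_password>
OPENROUTER_API_KEY=<your_openrouter_api_key>
MODEL_NAME=openai/gpt-4o-mini
```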
**Important**:

- **Neo4j Setup**: Get a free Neo4j Aura instance at https://neo4j.com/cloud/aura/ (recommended) or use any Neo4j instance
  - For Neo4j Aura, use the URI format `neo4j+s://xxxxx.databases.neo4j.io` (note the `+s` for a secure connection)
- Replace `<your_openrouter_api_key>` with your actual OpenRouter API key (get one at https://openrouter.ai)
  - Or use `OPENAI_API_KEY` if you prefer to use OpenAI directly
- Model names for OpenRouter should be in the format `provider/model-name` (e.g., `openai/gpt-4o-mini`, `anthropic/claude-3-haiku`)

## Use MCP Server

### Run MCP Server

**Simple method (recommended):** use the provided script:

```powershell
.\run-server.ps1
```

**Or run directly:**

For Cursor (SSE transport):

```bash
# Using uv
uv run graphiti_mcp_server.py --transport sse --port 8000

# Or using python directly
python graphiti_mcp_server.py --transport sse --port 8000
```

For Claude (stdio transport):

```bash
uv run graphiti_mcp_server.py --transport stdio
```

The server will connect to your Neo4j instance (configured in `.env`) and start listening for connections.

**Note**: Make sure your `.env` file has the correct Neo4j connection details before starting the server.

### Available Tools

The MCP server provides the following tools:

1. **store_memory**: Store a memory or context in the graph database for future retrieval
2. **retrieve_memories**: Retrieve relevant memories from the graph database based on a query
3. **create_relationship**: Create a relationship between two memories or entities in the graph
4. **get_context**: Get contextual information for a given query by retrieving and synthesizing relevant memories
5. **search_graph**: Search the graph database using a Cypher query

### Web UI Demo

A web interface is available to interact with the Graphiti MCP server.

**Start the Web UI:**

```powershell
.\run-web-ui.ps1
```

Or directly:

```bash
python web_ui_server.py
```

Then open your browser to **http://localhost:8081** (or the port specified).

The web UI provides:

- **Store Memory**: Add new memories with tags and metadata
- **Retrieve Memories**: Search for relevant memories using semantic search
- **Get Context**: Get synthesized context from multiple memories
- **Create Relationships**: Link memories together in the knowledge graph
- **Search Graph**: Execute custom Cypher queries
- **Browse All**: View all stored memories

**Need example values?** See **[WEB_UI_EXAMPLES.md](WEB_UI_EXAMPLES.md)** for ready-to-use examples for each form.

### Example Usage

For comprehensive examples and use cases, see:

- **[EXAMPLE_USAGE.md](EXAMPLE_USAGE.md)**: Detailed examples showing how to use each tool in real-world scenarios
- **example_usage.py**: Python script demonstrating programmatic usage of the MCP server tools

To run the example script:

```bash
python example_usage.py
```

### Integrate MCP Clients

#### Cursor Configuration

Create or modify the `mcp.json` file in your Cursor configuration directory with the following content:

```json
{
  "mcpServers": {
    "Graphiti": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```

**Note**: The exact location of the `mcp.json` file depends on your Cursor installation.
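To give a feel for what a client sends when it invokes these tools, the sketch below builds example payloads for `store_memory` and `search_graph`. The tool names come from the list above, but the argument field names (`content`, `tags`, `metadata`, `query`) are illustrative assumptions — check EXAMPLE_USAGE.md and `example_usage.py` for the server's actual schemas:

```python
import json

# Hypothetical argument payloads for two of the tools listed above.
# Field names are assumptions, not taken from the server's real schema.
store_args = {
    "content": "User prefers TypeScript for new frontend projects",
    "tags": ["preferences", "frontend"],
    "metadata": {"source": "cursor-session"},
}

search_args = {
    # A Cypher query for the search_graph tool (illustrative node label)
    "query": "MATCH (m:Memory) WHERE 'preferences' IN m.tags RETURN m.content",
}

# MCP tool calls travel as JSON, so payloads must round-trip cleanly.
encoded = json.dumps({"name": "store_memory", "arguments": store_args})
decoded = json.loads(encoded)
print(decoded["name"])  # store_memory
```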
Typically, it's in:

- Windows: `%APPDATA%\Cursor\User\globalStorage\mcp.json`
- macOS: `~/Library/Application Support/Cursor/User/globalStorage/mcp.json`
- Linux: `~/.config/Cursor/User/globalStorage/mcp.json`

#### Claude Desktop Configuration

Create or modify the `claude_desktop_config.json` file (typically located at `~/Library/Application Support/Claude/claude_desktop_config.json` on macOS, or a similar path on other platforms) with the following content:

```json
{
  "mcpServers": {
    "graphiti": {
      "transport": "stdio",
      "command": "uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/path/to/graphiti_mcp",
        "--project",
        ".",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ]
    }
  }
}
```

**Important**: Update the `--directory` path to match your actual project directory path.

Alternatively, if you have `uv` in your PATH, you can use:

```json
{
  "mcpServers": {
    "graphiti": {
      "transport": "stdio",
      "command": "uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/path/to/graphiti_mcp",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ]
    }
  }
}
```

## Architecture

The Graphiti MCP server uses:

- **Neo4j**: Graph database for storing memories and relationships
- **OpenRouter/OpenAI**: For generating embeddings and synthesizing context (OpenRouter recommended for access to multiple models)
- **MCP Protocol**: For communication with AI agent hosts (Cursor, Claude)

Memories are stored as nodes in Neo4j with:

- Content (text)
- Embeddings (vector representations)
- Metadata (optional key-value pairs)
- Tags (for categorization)
- Timestamps

Relationships between memories can be created to build a knowledge graph.
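To make the retrieval side of this architecture concrete, here is a minimal, self-contained sketch of the idea: each memory carries an embedding vector, and a query is matched by cosine similarity. This illustrates the concept only — the actual server stores its vectors as Neo4j node properties and obtains embeddings from the OpenRouter/OpenAI API, and its real similarity search is not this code:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Toy "memories": in the real server these are Neo4j nodes whose
# embedding vectors come from the OpenRouter/OpenAI embedding API.
memories = [
    {"content": "User likes dark mode", "embedding": [0.9, 0.1, 0.0]},
    {"content": "Project uses Neo4j Aura", "embedding": [0.1, 0.9, 0.2]},
]

def retrieve(query_embedding, top_k=1):
    """Return the top_k memory contents most similar to the query."""
    ranked = sorted(
        memories,
        key=lambda m: cosine_similarity(query_embedding, m["embedding"]),
        reverse=True,
    )
    return [m["content"] for m in ranked[:top_k]]

print(retrieve([0.0, 1.0, 0.1]))  # -> ['Project uses Neo4j Aura']
```

This brute-force scan is fine for a handful of nodes; as the Performance note below suggests, a production deployment would lean on a Neo4j vector index instead of scoring every node in Python.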
## Troubleshooting

### Connection Issues

- **Neo4j Connection Error**: Ensure Neo4j is running and accessible at the configured URI
- **OpenRouter/OpenAI API Error**: Verify that your API key is correct and has sufficient credits
  - For OpenRouter: check your API key at https://openrouter.ai/keys
  - For OpenAI: check your API key at https://platform.openai.com/api-keys
- **MCP Server Not Starting**: Check that all dependencies are installed correctly

### Neo4j Connection Issues

- **Neo4j Connection Error**:
  - Ensure your Neo4j instance is running and accessible
  - For Neo4j Aura: check that your IP is whitelisted in the Aura settings
  - Verify the URI format: use `neo4j+s://` for Aura, `bolt://` for local instances
  - Test the connection manually using Neo4j Browser or cypher-shell
- **Connection Timeout**:
  - Check your firewall settings
  - Verify the Neo4j URI, username, and password in `.env`
  - For Aura, ensure your IP address is whitelisted in the Aura dashboard

### Performance

- For better vector search performance, consider setting up a Neo4j vector index
- The current implementation uses a simplified similarity search; for production, use Neo4j's Graph Data Science library

## Development

### Running Tests

```bash
pytest
```

### Code Formatting

```bash
black graphiti_mcp_server.py
ruff check graphiti_mcp_server.py
```

## License

This project is open source and available under the MIT License.

## Contribution

Contributions are welcome! Feel free to fork this repository and submit pull requests with your improvements.
