The Cognee MCP server is a memory manager for AI apps and agents, backed by various graph and vector stores and able to ingest from 30+ data sources. It runs only on the client's local machine because it depends on local resources, and it provides four main capabilities:
- Cognify: Converts text into a structured knowledge graph
- Codify: Transforms a codebase into a knowledge graph
- Search: Allows searching within the knowledge graph with customizable search types
- Prune: Resets the knowledge graph for a fresh start (removes stored data)
cognee MCP server
An MCP server project
Installing Manually
- Clone the cognee repo
- Install dependencies
- Activate the venv
- Add the new server to your Claude config
The file should be located here: ~/Library/Application\ Support/Claude/
You need to create claude_desktop_config.json in this folder if it doesn't exist. Make sure to add your paths and LLM API key to the file below. Use your editor of choice, for example Nano. A sketch of these commands and of the config file follows.
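A minimal sketch of these steps, assuming the uv-based setup that the development notes below rely on (uv sync --dev --all-extras --reinstall) and a macOS install of Claude Desktop; the repo URL and editor are only examples:

```bash
# Clone the repo and install dependencies (uv is assumed, as in the development steps below)
git clone https://github.com/topoteretes/cognee.git
cd cognee/cognee-mcp
uv sync --dev --all-extras --reinstall

# Activate the venv
source .venv/bin/activate

# Create or edit the Claude Desktop config
cd ~/Library/Application\ Support/Claude/
nano claude_desktop_config.json
```

A claude_desktop_config.json entry for the server could then look like the following. Only the overall mcpServers layout is the documented Claude Desktop format; the command, paths, server name, and environment variables are illustrative and must match your local checkout and LLM provider:

```json
{
  "mcpServers": {
    "cognee": {
      "command": "/Users/{user}/cognee/.venv/bin/uv",
      "args": ["--directory", "/Users/{user}/cognee/cognee-mcp", "run", "cognee"],
      "env": {
        "ENV": "local",
        "LLM_API_KEY": "your-llm-api-key"
      }
    }
  }
}
```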
Restart your Claude desktop.
Installing via Smithery
To install Cognee for Claude Desktop automatically via Smithery:
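Smithery installs are typically done with its CLI; a sketch, where the package identifier cognee is an assumption:

```bash
npx -y @smithery/cli install cognee --client claude
```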
Define the cognify tool in server.py (a hypothetical sketch follows), then restart your Claude desktop.
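What that tool definition looks like depends on the actual server.py; the following is only a rough, hypothetical sketch using the FastMCP decorator API from the official mcp Python SDK together with cognee's add/cognify calls:

```python
# Hypothetical sketch of a cognify tool; the real server.py may be structured differently.
from mcp.server.fastmcp import FastMCP
import cognee

mcp = FastMCP("cognee")

@mcp.tool()
async def cognify(text: str) -> str:
    """Turn the given text into a structured knowledge graph."""
    await cognee.add(text)    # ingest the raw text
    await cognee.cognify()    # build the knowledge graph from ingested data
    return "Cognify finished."

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio by default
```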
To use the debugger, run:
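A minimal invocation, reusing the mcp dev command that also appears in the development steps below:

```bash
mcp dev src/server.py
```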
Open the inspector, passing a timeout:
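For example (both the inspector's port and the timeout query parameter are assumptions about the MCP Inspector's defaults, not something this document confirms):

```
http://localhost:5173?timeout=120000
```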
To apply new changes while developing cognee, you need to:
- poetry lock (in the cognee folder)
- uv sync --dev --all-extras --reinstall
- mcp dev src/server.py
Development
To use a local cognee build, run the following in the root of the cognee repo:
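A sketch of the build command, assuming Poetry is used (poetry lock already appears above); the output directory is illustrative:

```bash
# Sketch: build cognee locally with Poetry; adjust the output directory to your layout.
poetry build -o ./cognee-mcp/sources
```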
After the build process is done, change the cognee library dependency inside cognee-mcp/pyproject.toml so that it points at the locally built package instead of the published release. After that, add the corresponding snippet to the same file (cognee-mcp/pyproject.toml); a sketch is shown below.
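As a hedged sketch, assuming the environment is managed with uv (as the uv sync steps above suggest), a local path override in pyproject.toml might look like this; the sources directory and wheel filename are hypothetical and must match your actual build output:

```toml
# Hypothetical: point the cognee dependency at the locally built wheel
[tool.uv.sources]
cognee = { path = "sources/cognee-0.1.0-py3-none-any.whl" }
```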