
cognee-mcp

cognee MCP server

Installing Manually

An MCP server project

  1. Clone the cognee repo
  2. Install dependencies:
brew install uv
cd cognee-mcp
uv sync --dev --all-extras --reinstall
  3. Activate the venv:
source .venv/bin/activate
  4. Add the new server to your Claude config:

The file should be located here: ~/Library/Application\ Support/Claude/

cd ~/Library/Application\ Support/Claude/

You need to create claude_desktop_config.json in this folder if it doesn't exist. Make sure to add your paths and LLM API key to the file below. Use your editor of choice, for example Nano:

nano claude_desktop_config.json
{
  "mcpServers": {
    "cognee": {
      "command": "/Users/{user}/cognee/.venv/bin/uv",
      "args": [
        "--directory",
        "/Users/{user}/cognee/cognee-mcp",
        "run",
        "cognee"
      ],
      "env": {
        "ENV": "local",
        "TOKENIZERS_PARALLELISM": "false",
        "LLM_API_KEY": "sk-"
      }
    }
  }
}

Restart your Claude desktop.
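If you prefer to build and validate the config programmatically before saving it, here is a minimal Python sketch (the path is the macOS default from above; "{user}" and the "sk-" key are placeholders you must fill in yourself):

```python
import json
from pathlib import Path

# macOS location of the Claude Desktop config (from the steps above).
config_path = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"

# Same structure as the snippet above; "{user}" and "sk-" are placeholders.
config = {
    "mcpServers": {
        "cognee": {
            "command": "/Users/{user}/cognee/.venv/bin/uv",
            "args": ["--directory", "/Users/{user}/cognee/cognee-mcp", "run", "cognee"],
            "env": {
                "ENV": "local",
                "TOKENIZERS_PARALLELISM": "false",
                "LLM_API_KEY": "sk-",
            },
        }
    }
}

# Round-trip through the json module to confirm the structure is valid JSON
# before writing it out (uncomment the last line to actually save the file).
text = json.dumps(config, indent=2)
assert json.loads(text)["mcpServers"]["cognee"]["command"].endswith("uv")
# config_path.write_text(text)
```

This catches malformed JSON (a common cause of Claude Desktop silently ignoring a server) before you restart the app.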

Installing via Smithery

To install Cognee for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install cognee --client claude

Define the cognify tool in server.py, then restart your Claude desktop.

To use the debugger, run:

mcp dev src/server.py

Open the inspector with the timeout passed as a query parameter:

http://localhost:5173?timeout=120000

To apply new changes while developing cognee, you need to:

  1. Run poetry lock in the cognee folder
  2. Run uv sync --dev --all-extras --reinstall
  3. Run mcp dev src/server.py

Development

To use a local cognee build, run the following in the root of the cognee repo:

poetry build -o ./cognee-mcp/sources

After the build process is done, change the cognee library dependency inside the cognee-mcp/pyproject.toml from

cognee[postgres,codegraph,gemini,huggingface]==0.1.38

to

cognee[postgres,codegraph,gemini,huggingface]

After that, add the following snippet to the same file (cognee-mcp/pyproject.toml):

[tool.uv.sources]
cognee = { path = "sources/cognee-0.1.38-py3-none-any.whl" }
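With both changes applied, the relevant parts of cognee-mcp/pyproject.toml should look roughly like this (a sketch assuming a standard [project] dependencies table; the rest of your file stays as it is):

```toml
[project]
dependencies = [
    # Unpinned so the local wheel below is used instead of the PyPI release.
    "cognee[postgres,codegraph,gemini,huggingface]",
]

[tool.uv.sources]
# Points uv at the wheel produced by `poetry build -o ./cognee-mcp/sources`.
cognee = { path = "sources/cognee-0.1.38-py3-none-any.whl" }
```

You will likely need to re-run uv sync --dev --all-extras --reinstall afterwards so uv installs the local wheel.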

