
cognee-mcp

cognee MCP server

Installing manually

A MCP server project

  1. Clone the cognee repo
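For example (a minimal sketch, assuming the repo lives at github.com/topoteretes/cognee, as the Glama listing's slug suggests):

git clone https://github.com/topoteretes/cognee.git
cd cognee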
  2. Install dependencies
brew install uv
cd cognee-mcp
uv sync --dev --all-extras --reinstall
  3. Activate the venv with
source .venv/bin/activate
  4. Add the new server to your Claude config:

The file should be located here: ~/Library/Application\ Support/Claude/

cd ~/Library/Application\ Support/Claude/

If it doesn't exist, you need to create claude_desktop_config.json in this folder. Make sure to add your paths and your LLM API key to the file below, using an editor of your choice, e.g. nano:

nano claude_desktop_config.json
{
  "mcpServers": {
    "cognee": {
      "command": "/Users/{user}/cognee/.venv/bin/uv",
      "args": [
        "--directory",
        "/Users/{user}/cognee/cognee-mcp",
        "run",
        "cognee"
      ],
      "env": {
        "ENV": "local",
        "TOKENIZERS_PARALLELISM": "false",
        "LLM_API_KEY": "sk-"
      }
    }
  }
}
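Before restarting, you can sanity-check that the file parses as valid JSON (a quick check via Python's standard-library json.tool; any JSON validator works):

python3 -m json.tool claude_desktop_config.json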

Restart your Claude desktop.

Installing via Smithery

To install Cognee for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install cognee --client claude

Define the cognify tool in server.py, then restart your Claude desktop.

To use the debugger, run:

mcp dev src/server.py

Open the inspector with a timeout passed in the URL:

http://localhost:5173?timeout=120000
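The timeout query parameter presumably raises the inspector's request timeout (here 120000 ms), which helps with long-running cognify calls; that reading is an assumption on my part. On macOS you can open it straight from the shell:

open "http://localhost:5173?timeout=120000"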

To apply new changes while developing cognee, you need to do the following (combined into a single sketch after this list):

  1. poetry lock in the cognee folder
  2. uv sync --dev --all-extras --reinstall
  3. mcp dev src/server.py
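A sketch of the three steps in one pass, assuming the repo was cloned to ~/cognee and that "the cognee folder" means the repo root (paths are illustrative):

cd ~/cognee && poetry lock                  # refresh the lockfile
cd cognee-mcp && uv sync --dev --all-extras --reinstall
mcp dev src/server.py                       # relaunch with the new build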

Development

In order to use a local cognee build, run this in the root of the cognee repo:

poetry build -o ./cognee-mcp/sources
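This drops the built distribution into cognee-mcp/sources; a quick listing confirms the wheel name (the version in the filename depends on the cognee version you built):

ls cognee-mcp/sources/
# expect something like cognee-0.1.38-py3-none-any.whl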

After the build process is done, change the cognee library dependency inside cognee-mcp/pyproject.toml from

cognee[postgres,codegraph,gemini,huggingface]==0.1.38

to

cognee[postgres,codegraph,gemini,huggingface]

After that, add the following snippet to the same file (cognee-mcp/pyproject.toml):

[tool.uv.sources]
cognee = { path = "sources/cognee-0.1.38-py3-none-any.whl" }
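With both pyproject.toml edits in place, re-running the sync step from the installation section should make uv install cognee from the local wheel rather than from PyPI (same flags as above):

uv sync --dev --all-extras --reinstall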


This is a local-only server: it can only run on the client's local machine because it depends on local resources.

cognee is a memory manager for AI applications and agents that uses various graph and vector stores and supports ingesting data from more than 30 data sources.
