
MCP Documentation Server

by esakrissa


A customized version of the MCP documentation server that enables integration between LLM applications (such as Cursor, Claude Desktop, and Windsurf) and documentation sources via the Model Context Protocol.

Overview

This server provides MCP host applications with:

  1. Access to specific documentation files (langgraph.txt and mcp.txt)
  2. A tool for fetching documentation from the URLs listed in those files

Supported Documentation

Currently configured for:

  • LangGraph (langgraph.txt)
  • Model Context Protocol (mcp.txt)

Quick Start

Setup

# Clone the repository
git clone https://github.com/esakrissa/mcp-doc.git
cd mcp-doc

# Create and activate a virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install the package in development mode
pip install -e .

Running the Server

You can run the server using the installed command:

# Run the server with the config file
mcpdoc \
    --json config.json \
    --transport sse \
    --port 8082 \
    --host localhost
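The repository ships its own config.json. As a rough sketch of its shape: the upstream mcpdoc format is a JSON list of named documentation sources, so an equivalent file for this project might look like the following (the name and llms_txt keys are assumed from upstream mcpdoc's sample config, and JSON permits no comments, so treat the whole file as illustrative):

[
  {
    "name": "LangGraph",
    "llms_txt": "https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/langgraph.txt"
  },
  {
    "name": "ModelContextProtocol",
    "llms_txt": "https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/mcp.txt"
  }
]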

Or, if you prefer to use uv:

# Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Run the server with uv
uvx --from mcpdoc mcpdoc \
    --json config.json \
    --transport sse \
    --port 8082 \
    --host localhost

IDE Integration

Cursor

Add the following to ~/.cursor/mcp.json:

{ "mcpServers": { "mcp-doc": { "command": "uvx", "args": [ "--from", "mcpdoc", "mcpdoc", "--urls", "LangGraph:https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/langgraph.txt", "ModelContextProtocol:https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/mcp.txt", "--allowed-domains", "*", "--transport", "stdio" ] } } }

Then add these instructions to Cursor's custom instructions:

for ANY question about LangGraph and Model Context Protocol (MCP), use the mcp-doc server to help answer --
+ call list_doc_sources tool to get the available documentation files
+ call fetch_docs tool to read the langgraph.txt or mcp.txt file
+ reflect on the urls in langgraph.txt or mcp.txt
+ reflect on the input question
+ call fetch_docs on any urls relevant to the question
+ use this to answer the question

To verify that the integration works, ask Cursor a question about LangGraph or MCP and check whether it uses the documentation server tools to fetch the information.
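Outside of Cursor, one way to exercise the tools by hand is the MCP Inspector. A minimal sketch, assuming Node.js is installed and pointing the inspector at the same stdio invocation used in the Cursor config above:

# Launch the MCP Inspector against the stdio server (illustrative invocation;
# adjust the flags to match your setup)
npx @modelcontextprotocol/inspector \
    uvx --from mcpdoc mcpdoc \
    --urls "LangGraph:https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/langgraph.txt" \
           "ModelContextProtocol:https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/mcp.txt" \
    --allowed-domains '*' \
    --transport stdio

From the inspector UI you can call list_doc_sources and fetch_docs directly and confirm the server responds before wiring it into an IDE.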

Security Notes

For security reasons, strict domain access controls are enforced:

  • Remote documentation files: only their specific domains are allowed automatically
  • Local documentation files: no domains are allowed automatically
  • Use --allowed-domains to add domains explicitly, or --allowed-domains '*' to allow all domains (use with caution); see the sketch after this list
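For instance, a minimal sketch that allowlists only the GitHub raw-content domain instead of '*' (the domain value is an assumption based on the documentation URLs used in this project):

# Allow fetching only from raw.githubusercontent.com
# (assumed domain, matching the docs URLs above)
mcpdoc \
    --json config.json \
    --allowed-domains raw.githubusercontent.com \
    --transport sse \
    --port 8082 \
    --host localhost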

References

This project is based on the original mcpdoc by LangChain AI, modified to provide focused documentation access for LangGraph and MCP.
