
MCP Documentation Server

by esakrissa

A customized version of the MCP documentation server that enables integration between LLM applications (like Cursor, Claude Desktop, Windsurf) and documentation sources via the Model Context Protocol.

Overview

This server provides MCP host applications with:

  1. Access to specific documentation files (langgraph.txt and mcp.txt)
  2. Tools to fetch documentation from URLs within those files
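
As an illustration of how a client can exercise these two tools, here is a minimal sketch using the official MCP Python SDK against the SSE server described in the Quick Start below. The http://localhost:8082/sse endpoint path and the "url" argument name for fetch_docs are assumptions about common defaults, not details taken from this project:

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Assumes the server is running with --transport sse --port 8082;
    # the /sse path is the usual default for SSE-based MCP servers.
    async with sse_client("http://localhost:8082/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 1. List the available documentation files (langgraph.txt, mcp.txt)
            sources = await session.call_tool("list_doc_sources", {})
            print(sources)

            # 2. Fetch a documentation URL found in one of those files.
            #    The "url" argument name is an assumption about the tool's schema.
            docs = await session.call_tool(
                "fetch_docs",
                {"url": "https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/langgraph.txt"},
            )
            print(docs)

asyncio.run(main())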

Supported Documentation

Currently set up for:

  • LangGraph Documentation (from https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/langgraph.txt)
  • MCP Documentation (from https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/mcp.txt)
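
Each of these .txt files is essentially an index of links into the official documentation, and the fetch_docs tool follows the URLs listed inside them. As a purely illustrative sketch (not the actual contents of langgraph.txt), a file in the common llms.txt style looks roughly like:

# LangGraph

> Links into the LangGraph documentation.

- [Concepts](https://langchain-ai.github.io/langgraph/concepts/): core ideas behind LangGraph
- [Tutorials](https://langchain-ai.github.io/langgraph/tutorials/): step-by-step guides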

Quick Start

Setup and Run

# Clone the repository
git clone https://github.com/esakrissa/mcp-doc.git
cd mcp-doc

# Create and activate a virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install the package in development mode
pip install -e .

Running the Server

You can run the server using the installed command:

# Run the server with the config file
mcpdoc \
  --json config.json \
  --transport sse \
  --port 8082 \
  --host localhost
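
The config.json passed via --json is not included in this README. A minimal sketch, assuming it follows the original mcpdoc convention of listing each documentation source with a name and an llms_txt URL, might look like:

[
  {
    "name": "LangGraph",
    "llms_txt": "https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/langgraph.txt"
  },
  {
    "name": "ModelContextProtocol",
    "llms_txt": "https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/mcp.txt"
  }
]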

Or if you prefer using UV:

# Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Run the server with uv
uvx --from mcpdoc mcpdoc \
  --json config.json \
  --transport sse \
  --port 8082 \
  --host localhost

IDE Integration

Cursor

Add to ~/.cursor/mcp.json

{ "mcpServers": { "mcp-doc": { "command": "uvx", "args": [ "--from", "mcpdoc", "mcpdoc", "--urls", "LangGraph:https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/langgraph.txt", "ModelContextProtocol:https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/mcp.txt", "--allowed-domains", "*", "--transport", "stdio" ] } } }

Then add these instructions to Cursor's Custom Instructions:

for ANY question about LangGraph and Model Context Protocol (MCP), use the mcp-doc server to help answer --
+ call list_doc_sources tool to get the available documentation files
+ call fetch_docs tool to read the langgraph.txt or mcp.txt file
+ reflect on the urls in langgraph.txt or mcp.txt
+ reflect on the input question
+ call fetch_docs on any urls relevant to the question
+ use this to answer the question

To test if the integration is working, ask Cursor a question about LangGraph or MCP, and check if it uses the documentation server tools to fetch information.

Security Note

For security reasons, strict domain access controls are implemented:

  • Remote documentation files: Only the specific domain is automatically allowed
  • Local documentation files: No domains are automatically allowed
  • Use --allowed-domains to explicitly add domains, or --allowed-domains '*' to allow all domains (use with caution), as shown in the sketch below
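
For example, a rough sketch of serving a local documentation file while explicitly allowing one documentation domain. The local path and the domain here are placeholders, not part of this project, and this assumes local paths are accepted in --urls the same way remote URLs are:

# Local file: no domains are allowed automatically, so add them explicitly
mcpdoc \
  --urls "LangGraph:./docs/langgraph.txt" \
  --allowed-domains langchain-ai.github.io \
  --transport stdio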

References

This project is based on the original mcpdoc by LangChain AI, modified to provide focused documentation access for LangGraph and MCP.
