# MCP LLMS-TXT Documentation Server

## Overview
llms.txt is a website index for LLMs, providing background information, guidance, and links to detailed markdown files. IDEs like Cursor and Windsurf or apps like Claude Code/Desktop can use llms.txt to retrieve context for tasks. However, these apps use different built-in tools to read and process files like llms.txt. The retrieval process can be opaque, and there is not always a way to audit the tool calls or the context returned.
MCP offers a way for developers to have full control over the tools used by these applications. Here, we create an open-source MCP server to provide MCP host applications (e.g., Cursor, Windsurf, Claude Code/Desktop) with (1) a user-defined list of llms.txt files and (2) a simple `fetch_docs` tool to read URLs within any of the provided llms.txt files. This allows the user to audit each tool call as well as the context returned.
## llms-txt

You can find llms.txt files for LangGraph and LangChain here:

| Library | llms.txt |
| --- | --- |
| LangGraph Python | https://langchain-ai.github.io/langgraph/llms.txt |
| LangGraph JS | https://langchain-ai.github.io/langgraphjs/llms.txt |
| LangChain Python | https://python.langchain.com/llms.txt |
| LangChain JS | https://js.langchain.com/llms.txt |
## Quickstart

### Install uv
- See the official uv docs for other ways to install uv.
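On macOS and Linux, the standalone installer from the official uv docs is:

```shell
curl -LsSf https://astral.sh/uv/install.sh | sh
```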
### Choose an llms.txt file to use
- For example, here's the LangGraph `llms.txt` file: https://langchain-ai.github.io/langgraph/llms.txt
**Note: Security and Domain Access Control**

For security reasons, mcpdoc implements strict domain access controls:

- **Remote llms.txt files**: When you specify a remote llms.txt URL (e.g., https://langchain-ai.github.io/langgraph/llms.txt), mcpdoc automatically adds only that specific domain (langchain-ai.github.io) to the allowed-domains list. This means the tool can only fetch documentation from URLs on that domain.
- **Local llms.txt files**: When using a local file, NO domains are automatically added to the allowed list. You MUST explicitly specify which domains to allow using the `--allowed-domains` parameter.
- **Adding additional domains**: To allow fetching from domains beyond those automatically included:
  - Use `--allowed-domains domain1.com domain2.com` to add specific domains.
  - Use `--allowed-domains '*'` to allow all domains (use with caution).

This security measure prevents unauthorized access to domains not explicitly approved by the user, ensuring that documentation can only be retrieved from trusted sources.
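For example, a launch command that serves a local llms.txt file while allowing fetches only from the LangGraph docs domain might look like this (the file path is illustrative, and passing a local path through `--urls` is an assumption; `--allowed-domains` and `--transport` are the options described above):

```shell
uvx --from mcpdoc mcpdoc \
  --urls LangGraph:/path/to/llms.txt \
  --allowed-domains langchain-ai.github.io \
  --transport stdio
```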
(Optional) Test the MCP server locally with your llms.txt file(s) of choice:
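A sketch of such a local run over SSE. The `--urls` and `--transport` options appear elsewhere in this README; the `--port` and `--host` flag names are assumptions:

```shell
uvx --from mcpdoc mcpdoc \
  --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt" \
  --transport sse \
  --port 8082 \
  --host localhost
```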
- This should run at: http://localhost:8082 
- Run MCP inspector and connect to the running server: 
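The inspector is published as an npm package, so one way to launch it is:

```shell
npx @modelcontextprotocol/inspector
```

Then point the inspector at the server running on port 8082.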
- Here, you can test the `tool` calls.
## Connect to Cursor
- Open `Cursor Settings` and the `MCP` tab.
- This will open the `~/.cursor/mcp.json` file.
- Paste the following into the file (we use the `langgraph-docs-mcp` name and link to the LangGraph `llms.txt`).
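A configuration consistent with the stdio example shown later in this README:

```json
{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--from",
        "mcpdoc",
        "mcpdoc",
        "--urls",
        "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
        "--transport",
        "stdio"
      ]
    }
  }
}
```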
- Confirm that the server is running in your `Cursor Settings/MCP` tab.
- Best practice is to then update Cursor Global (User) rules.
- Open Cursor `Settings/Rules` and update `User Rules` with the following (or similar):
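One plausible rule set: `fetch_docs` is the tool described in the Overview, while `list_doc_sources` is an assumed name for the tool that lists the configured llms.txt files:

```
for ANY question about LangGraph, use the langgraph-docs-mcp server to help answer --
+ call list_doc_sources tool to get the available llms.txt file
+ call fetch_docs tool to read it
+ reflect on the urls in llms.txt
+ reflect on the input question
+ call fetch_docs on any urls relevant to the question
```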
- `CMD+L` (on Mac) to open chat.
- Ensure `agent` is selected.
Then, try an example prompt, such as:
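Any question the LangGraph docs can answer works; for instance:

```
what is LangGraph?
```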
## Connect to Windsurf
- Open Cascade with `CMD+L` (on Mac).
- Click `Configure MCP` to open the config file, `~/.codeium/windsurf/mcp_config.json`.
- Update with `langgraph-docs-mcp` as noted above.
- Update `Windsurf Rules/Global rules` with the following (or similar):
Then, try the example prompt:
- It will perform your tool calls. 
## Connect to Claude Desktop
- Open `Settings/Developer` to update `~/Library/Application Support/Claude/claude_desktop_config.json`.
- Update with `langgraph-docs-mcp` as noted above.
- Restart the Claude Desktop app.
> [!NOTE]
> If you run into issues with Python version incompatibility when trying to add MCPDoc tools to Claude Desktop, you can explicitly specify the filepath to the `python` executable in the `uvx` command:

```json
{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--python",
        "/path/to/python",
        "--from",
        "mcpdoc",
        "mcpdoc",
        "--urls",
        "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
        "--transport",
        "stdio"
      ]
    }
  }
}
```
> [!NOTE]
> Currently (3/21/25) it appears that Claude Desktop does not support `rules` for global rules, so append the following to your prompt.
- You will see your tools visible in the bottom right of your chat input. 
Then, try the example prompt:
- It will ask to approve tool calls as it processes your request. 
## Connect to Claude Code
- In a terminal after installing Claude Code, run this command to add the MCP server to your project: 
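A sketch of that command, assuming Claude Code's `claude mcp add-json` subcommand and reusing the stdio configuration from the Claude Desktop section:

```shell
claude mcp add-json langgraph-docs-mcp '{"type":"stdio","command":"uvx","args":["--from","mcpdoc","mcpdoc","--urls","LangGraph:https://langchain-ai.github.io/langgraph/llms.txt","--transport","stdio"]}'
```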
- You will see `~/.claude.json` updated.
- Test by launching Claude Code and running the following to view your tools:
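Assuming Claude Code's `/mcp` slash command for listing configured MCP servers:

```
/mcp
```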
> [!NOTE]
> Currently (3/21/25) it appears that Claude Code does not support `rules` for global rules, so append the following to your prompt.
Then, try the example prompt:
- It will ask to approve tool calls. 
## Command-line Interface

The `mcpdoc` command provides a simple CLI for launching the documentation server.
You can specify documentation sources in three ways, and these can be combined:
- Using a YAML config file: 
- This will load the LangGraph Python documentation from the `sample_config.yaml` file in this repo.
- Using a JSON config file: 
- This will load the LangGraph Python documentation from the `sample_config.json` file in this repo.
- Directly specifying llms.txt URLs with optional names: 
- URLs can be specified either as plain URLs or with optional names using the format `name:url`.
- You can specify multiple URLs by using the `--urls` parameter multiple times.
- This is how we loaded `llms.txt` for the MCP server above.
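A sketch of the three invocation styles (the `--yaml` and `--json` flag names are assumptions; `--urls` appears in the configs above):

```shell
# Load sources from a YAML config file (flag name assumed)
uvx --from mcpdoc mcpdoc --yaml sample_config.yaml

# Load sources from a JSON config file (flag name assumed)
uvx --from mcpdoc mcpdoc --json sample_config.json

# Specify llms.txt URLs directly, repeating --urls for multiple sources
uvx --from mcpdoc mcpdoc \
  --urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt \
  --urls LangChain:https://python.langchain.com/llms.txt
```

Combining a config file with extra `--urls` entries in one command merges the documentation sources.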
You can also combine these methods to merge documentation sources:
## Additional Options

- `--follow-redirects`: Follow HTTP redirects (defaults to `False`)
- `--timeout SECONDS`: HTTP request timeout in seconds (defaults to `10.0`)
Example with additional options:
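Under the same assumed `--yaml` flag, that example might be:

```shell
uvx --from mcpdoc mcpdoc \
  --yaml sample_config.yaml \
  --timeout 15 \
  --follow-redirects
```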
This will load the LangGraph Python documentation with a 15-second timeout and follow any HTTP redirects if necessary.
## Configuration Format

Both YAML and JSON configuration files should contain a list of documentation sources.

Each source must include an `llms_txt` URL and can optionally include a `name`:
### YAML Configuration Example (sample_config.yaml)
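A minimal example matching the schema above (the entry shown is illustrative):

```yaml
# Each entry is one documentation source; llms_txt is required, name is optional.
- name: LangGraph Python
  llms_txt: https://langchain-ai.github.io/langgraph/llms.txt
```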
### JSON Configuration Example (sample_config.json)
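A minimal JSON example matching the schema above (the entry shown is illustrative):

```json
[
  {
    "name": "LangGraph Python",
    "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt"
  }
]
```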
## Programmatic Usage