MCP LLMS-TXT Documentation Server
Overview
`llms.txt` is a website index for LLMs, providing background information, guidance, and links to detailed markdown files. IDEs like Cursor and Windsurf, or apps like Claude Code/Desktop, can use `llms.txt` to retrieve context for tasks. However, these apps use different built-in tools to read and process files like `llms.txt`. The retrieval process can be opaque, and there is not always a way to audit the tool calls or the context returned.

MCP offers a way for developers to have full control over the tools used by these applications. Here, we create an open source MCP server to provide MCP host applications (e.g., Cursor, Windsurf, Claude Code/Desktop) with (1) a user-defined list of `llms.txt` files and (2) a simple `fetch_docs` tool that reads URLs within any of the provided `llms.txt` files. This allows the user to audit each tool call as well as the context returned.
Quickstart
Install uv
- Please see the official uv docs for other ways to install `uv`.
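For example, on macOS/Linux the official standalone installer can be used (one of several supported install methods):

```shell
# Install uv via the official standalone installer
curl -LsSf https://astral.sh/uv/install.sh | sh
```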
Choose an `llms.txt` file to use.
- For example, here's the LangGraph `llms.txt` file: https://langchain-ai.github.io/langgraph/llms.txt
(Optional) Test the MCP server locally with your `llms.txt` file of choice:
- This should run at: http://localhost:8082
- Run the MCP inspector and connect to the running server.
- Here, you can test the `tool` calls.
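The local test above can be sketched as follows (a sketch, assuming the `mcpdoc` package is runnable via `uvx` and using the LangGraph `llms.txt` URL):

```shell
# Launch the server over SSE on port 8082
uvx --from mcpdoc mcpdoc \
    --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt" \
    --transport sse \
    --port 8082 \
    --host localhost

# In another terminal, run the MCP inspector and connect to the running server
npx @modelcontextprotocol/inspector
```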
Connect to Cursor
- Open `Cursor Settings` and the `MCP` tab.
- This will open the `~/.cursor/mcp.json` file.
- Paste the following into the file (we use the `langgraph-docs-mcp` name and link to the LangGraph `llms.txt`).
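A config along these lines works (a sketch; it registers the server under the `langgraph-docs-mcp` name with the LangGraph `llms.txt` URL, running over stdio):

```json
{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--from", "mcpdoc", "mcpdoc",
        "--urls", "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
        "--transport", "stdio"
      ]
    }
  }
}
```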
- Confirm that the server is running in your `Cursor Settings/MCP` tab.
- Best practice is to then update the Cursor Global (User) rules.
- Open Cursor `Settings/Rules` and update `User Rules` with the following (or similar):
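A rules snippet along these lines works (a sketch; `list_doc_sources` is an assumed companion tool to the `fetch_docs` tool described above, used to list the configured `llms.txt` sources):

```
for ANY question about LangGraph, use the langgraph-docs-mcp server to help answer --
+ call list_doc_sources tool to get the available llms.txt file
+ call fetch_docs tool to read it
+ reflect on the urls in llms.txt
+ reflect on the input question
+ call fetch_docs on any urls relevant to the question
```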
- `CMD+L` (on Mac) to open chat.
- Ensure `agent` is selected.
Then, try an example prompt, such as:
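For instance, any question the configured `llms.txt` covers will do; with the LangGraph docs loaded, something like:

```
what are types of memory in LangGraph?
```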
Connect to Windsurf
- Open Cascade with `CMD+L` (on Mac).
- Click `Configure MCP` to open the config file, `~/.codeium/windsurf/mcp_config.json`.
- Update with `langgraph-docs-mcp` as noted above.
- Update `Windsurf Rules/Global rules` with the following (or similar):
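For example (a sketch; `list_doc_sources` is an assumed companion tool to `fetch_docs` for listing the configured `llms.txt` sources):

```
for ANY question about LangGraph, use the langgraph-docs-mcp server to help answer --
+ call list_doc_sources tool to get the available llms.txt file
+ call fetch_docs tool to read it
+ reflect on the urls in llms.txt
+ reflect on the input question
+ call fetch_docs on any urls relevant to the question
```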
Then, try the example prompt:
- It will perform your tool calls.
Connect to Claude Desktop
- Open `Settings/Developer` to update `~/Library/Application\ Support/Claude/claude_desktop_config.json`.
- Update with `langgraph-docs-mcp` as noted above.
- Restart the Claude Desktop app.
Note: currently (3/21/25) it appears that Claude Desktop does not support `rules` for global rules, so append the following to your prompt.
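For example, a snippet like this can be appended (a sketch; `list_doc_sources` is an assumed companion tool to `fetch_docs`):

```
for ANY question about LangGraph, use the langgraph-docs-mcp server to help answer --
+ call list_doc_sources tool to get the available llms.txt file
+ call fetch_docs tool to read it
+ reflect on the urls in llms.txt
+ reflect on the input question
+ call fetch_docs on any urls relevant to the question
```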
- You will see your tools visible in the bottom right of your chat input.
Then, try the example prompt:
- It will ask to approve tool calls as it processes your request.
Connect to Claude Code
- In a terminal, after installing Claude Code, run this command to add the MCP server to your project:
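A sketch of such a command, using `claude mcp add-json` with the same `uvx` invocation and LangGraph `llms.txt` URL as in the configs above:

```shell
claude mcp add-json langgraph-docs '{"type":"stdio","command":"uvx","args":["--from","mcpdoc","mcpdoc","--urls","LangGraph:https://langchain-ai.github.io/langgraph/llms.txt"]}'
```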
- You will see `~/.claude.json` updated.
- Test by launching Claude Code and running the following to view your tools:
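For example:

```shell
claude   # launch Claude Code
# then, inside the session, type the slash command:
# /mcp
```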
Note: currently (3/21/25) it appears that Claude Code does not support `rules` for global rules, so append the following to your prompt.
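For example, a snippet like this can be appended (a sketch; `list_doc_sources` is an assumed companion tool to `fetch_docs`):

```
for ANY question about LangGraph, use the langgraph-docs-mcp server to help answer --
+ call list_doc_sources tool to get the available llms.txt file
+ call fetch_docs tool to read it
+ reflect on the urls in llms.txt
+ reflect on the input question
+ call fetch_docs on any urls relevant to the question
```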
Then, try the example prompt:
- It will ask to approve tool calls.
Command-line Interface
The `mcpdoc` command provides a simple CLI for launching the documentation server.
You can specify documentation sources in three ways, and these can be combined:
- Using a YAML config file:
  - This will load the LangGraph Python documentation from the `sample_config.yaml` file in this repo.
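A sketch of the invocation (assuming `sample_config.yaml` is in the current directory):

```shell
mcpdoc --yaml sample_config.yaml
```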
- Using a JSON config file:
  - This will load the LangGraph Python documentation from the `sample_config.json` file in this repo.
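A sketch of the invocation (assuming `sample_config.json` is in the current directory):

```shell
mcpdoc --json sample_config.json
```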
- Directly specifying `llms.txt` URLs with optional names:
  - URLs can be specified either as plain URLs or with optional names using the format `name:url`.
  - This is how we loaded `llms.txt` for the MCP server above.
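For example, using the `name:url` format with the LangGraph `llms.txt`:

```shell
mcpdoc --urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt
```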
You can also combine these methods to merge documentation sources:
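A combined invocation might look like this (a sketch, reusing the sample config files and the LangGraph URL from above):

```shell
mcpdoc --yaml sample_config.yaml --json sample_config.json \
    --urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt
```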
Additional Options
- `--follow-redirects`: Follow HTTP redirects (defaults to False)
- `--timeout SECONDS`: HTTP request timeout in seconds (defaults to 10.0)
Example with additional options:
This will load the LangGraph Python documentation with a 15-second timeout and follow any HTTP redirects if necessary.
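That example might be written as (a sketch, combining the two flags documented above with the sample YAML config):

```shell
mcpdoc --yaml sample_config.yaml --timeout 15 --follow-redirects
```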
Configuration Format
Both YAML and JSON configuration files should contain a list of documentation sources.
Each source must include an `llms_txt` URL and can optionally include a `name`:
YAML Configuration Example (sample_config.yaml)
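A minimal `sample_config.yaml` along these lines (a sketch; the entry name is illustrative):

```yaml
# Each entry must have an llms_txt URL; name is optional
- name: LangGraph Python
  llms_txt: https://langchain-ai.github.io/langgraph/llms.txt
```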
JSON Configuration Example (sample_config.json)
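The equivalent `sample_config.json` sketch:

```json
[
  {
    "name": "LangGraph Python",
    "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt"
  }
]
```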
Programmatic Usage
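The server can also be constructed from Python. The sketch below assumes the package exposes a `create_server` factory in `mcpdoc.main` that accepts a list of documentation sources plus the HTTP options documented above; check the package source for the exact entry point:

```python
from mcpdoc.main import create_server  # assumed entry point

# Create a server with documentation sources (same shape as the config files)
server = create_server(
    [
        {
            "name": "LangGraph Python",
            "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt",
        },
    ],
    follow_redirects=True,
    timeout=15.0,
)

# Run over stdio, the transport used by the IDE configs above
server.run(transport="stdio")
```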