# EasyPeasyMCP

A lightweight, zero-config MCP server for documentation projects.

Give it an `llms-full.txt` file (local path or URL) and optional OpenAPI/AsyncAPI directories; it also helps you build one if you don't have it yet. It registers only the MCP tools that make sense for what you've provided: no code changes, no hard-coded paths.
## Why it's different
- No RAG, no vector database, no embedding pipeline. Search is a case-insensitive line scan with configurable context, all in-process, in memory. For small projects with well-structured content like `llms-full.txt`, this is all you need to get started: no infrastructure, no ops burden, easy to pitch internally. The entire search capability is ~25 lines of vanilla JS with zero runtime dependencies.
- Any project with an `llms-full.txt` is MCP-enabled in 30 seconds. Point `llmsTxt` at a hosted URL and you're done: no local file sync, no pipeline. When the docs update, the AI gets fresh content automatically. It's the adoption curve that matters: the llms.txt standard is becoming the norm for docs sites, and this tool makes every one of them instantly AI-accessible.
- Don't have an `llms-full.txt` yet? No problem: as long as you have Markdown files, the bundled `easy-peasy-build` CLI will generate one for you from your docs and specs.
- Conditional tool registration keeps the AI's context clean. No OpenAPI directory? No `list_openapi_specs` tool. Tools only appear when the content exists, so the MCP surface matches exactly what you've provided.
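The line-scan approach can be pictured in a few lines of vanilla JS. This is an illustrative sketch of the technique, not the server's actual source; the function and parameter names are assumptions:

```javascript
// Case-insensitive line scan with configurable context, kept entirely
// in memory. Hypothetical sketch of the approach; not the server's code.
function searchLines(content, query, contextLines = 2) {
  const lines = content.split("\n");
  const needle = query.toLowerCase();
  const hits = [];
  lines.forEach((line, i) => {
    if (!line.toLowerCase().includes(needle)) return;
    // Include a few surrounding lines so the agent gets usable context.
    const start = Math.max(0, i - contextLines);
    const end = Math.min(lines.length, i + contextLines + 1);
    hits.push({ line: i + 1, snippet: lines.slice(start, end).join("\n") });
  });
  return hits;
}
```

Because everything is already in memory, a query is just one pass over the loaded text, which is why no external infrastructure is needed.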
## When to use this — and when not to
This is a speed-first tool. Use it when you need an agent to access new knowledge in minutes, not days — a quick proof of concept, a personal workflow, a demo, or an early internal pilot where getting something working fast matters more than getting it perfect.
For professional, long-term setups shared across teams, you will eventually want a proper chunk → embed → RAG pipeline instead. That gives you semantic search (the agent finds meaning, not just matching words), much lower token consumption per query, and the ability to scale across large or frequently updated knowledge bases without loading everything into memory. This tool loads the full content on every startup — that's fine for a few hundred KB, but it's a ceiling, not a foundation.
No docs at all? Not even Markdown files? If you're in a real hurry, just ask the agent to scrape the developer portal you depend on — it can crawl the relevant pages and pull the content together. It can even check common locations for OpenAPI or AsyncAPI specs and fetch those too. Combine that with easy-peasy-build and you have a working MCP server in minutes, with zero local files to maintain.
The honest summary: use this to validate that AI-assisted documentation is worth investing in. Once it is, graduate to a proper RAG stack.
## How it works
| What you provide | Tools registered |
| --- | --- |
| `llms-full.txt` (path or URL) | `search_documentation` |
| OpenAPI directory | `list_openapi_specs` |
| AsyncAPI directory | `list_asyncapi_specs` |

`search_documentation` covers all loaded content (`llms-full.txt` + all specs).
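Conditional registration amounts to a simple mapping from provided content to tool names. A minimal sketch, assuming a hypothetical `toolsFor` helper (the real server registers tools through the MCP SDK):

```javascript
// Register only the tools that match the content actually provided,
// so the MCP surface stays as small as the configuration.
// Illustrative sketch; helper name is an assumption.
function toolsFor(config) {
  const tools = [];
  if (config.llmsTxt) tools.push("search_documentation");
  if (config.openapi) tools.push("list_openapi_specs");
  if (config.asyncapi) tools.push("list_asyncapi_specs");
  return tools;
}
```

For example, a config with only `llmsTxt` set would expose a single search tool and nothing else, keeping the agent's context free of irrelevant tool descriptions.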
## Quick start
Drop an `.easypeasymcp.json` (or `.easypeasymcp.yaml`) in your docs project root.

JSON:

```json
{
  "name": "my-project",
  "llmsTxt": "./llms-full.txt",
  "openapi": "./openapi",
  "asyncapi": "./asyncapi",
  "build": {
    "docs": ["./guides", "./api-reference"]
  }
}
```

YAML:

```yaml
name: my-project
llmsTxt: ./llms-full.txt
openapi: ./openapi
asyncapi: ./asyncapi
build:
  docs:
    - ./guides
    - ./api-reference
```

Paths are relative to the config file. Omit any key you don't have. `llmsTxt` can also be a URL. The `build` section is optional; include it if you want the server to regenerate `llms-full.txt` on every startup (add `--rebuild` to the command below).
Registration requires an absolute path to the config file (paths inside the config are resolved relative to it):
```shell
# Use an absolute path
claude mcp add my-project npx easy-peasy-mcp@0.0.11 \
  -- --rebuild --config /absolute/path/to/.easypeasymcp.json

# Or convert a relative path to absolute with shell expansion
claude mcp add my-project npx easy-peasy-mcp@0.0.11 \
  -- --rebuild --config $(pwd)/.easypeasymcp.json
```

No config file needed: pass everything directly. Works with URLs too:
```shell
claude mcp add asyncapi npx easy-peasy-mcp@0.0.11 -- \
  --name "asyncapi" \
  --llms https://raw.githubusercontent.com/derberg/EasyPeasyMCP/refs/heads/main/example-llms/asyncapi.txt
```

## Generating llms-full.txt
gitingest.com generates a single combined text file from any public repo or website. Good for a one-off grab when you don't need the file to stay in sync with updates.
For local Markdown files + OpenAPI/AsyncAPI specs:
```shell
npx --package=easy-peasy-mcp@0.0.11 easy-peasy-build \
  --docs ./guides \
  --docs ./api-reference \
  --openapi ./openapi \
  --asyncapi ./asyncapi \
  --output ./llms-full.txt
```

- `--docs` is repeatable for multiple source directories
- Reads `.md` and `.mdx` files recursively, sorted by name
- OpenAPI/AsyncAPI files are included as code-fenced blocks
- Omit `--output` to print to stdout
To keep `llms-full.txt` fresh automatically, add a `build` section to `.easypeasymcp.json` and pass `--rebuild` when registering the MCP server; it will regenerate on every startup instead of needing a manual run.
## Configuration reference
### `easy-peasy-mcp` (MCP server)
| CLI flag | Config key | Description |
| --- | --- | --- |
| `--config` | — | Path to the `.easypeasymcp.json` / `.easypeasymcp.yaml` config file. |
| `--name` | `name` | Server name, shown in the MCP client and embedded in tool descriptions. |
| `--llms` | `llmsTxt` | Path or URL to `llms-full.txt`. |
| `--openapi` | `openapi` | Path to a directory of OpenAPI specs (JSON/YAML). Registers `list_openapi_specs`. |
| `--asyncapi` | `asyncapi` | Path to a directory of AsyncAPI specs (JSON/YAML). Registers `list_asyncapi_specs`. |
| `--rebuild` | — | Rebuild `llms-full.txt` on every startup (requires a `build` section). |
| `--debug` | — | Enable debug logging to stderr. Useful for troubleshooting search issues or verifying content is loaded correctly. |
Config file paths are resolved relative to the config file's location. At least one of `--llms`, `--openapi`, or `--asyncapi` is required.
### `build` config section
Optional. When present, add `--rebuild` to the `claude mcp add` command and the server will regenerate `llms-full.txt` on every startup.
```json
{
  "name": "my-project",
  "llmsTxt": "./llms-full.txt",
  "openapi": "./openapi",
  "build": {
    "docs": ["./guides", "./api-reference"],
    "title": "My Project"
  }
}
```

`openapi` and `asyncapi` from the top level are reused automatically. `llmsTxt` is the output path.
### `easy-peasy-build` (`llms-full.txt` generator)
| CLI flag | Description |
| --- | --- |
| `--docs` | Markdown source directory. Repeatable for multiple directories. |
| `--openapi` | OpenAPI spec directory. Files included as code-fenced blocks. |
| `--asyncapi` | AsyncAPI spec directory. Files included as code-fenced blocks. |
| `--title` | Project title for the generated file header. |
| `--output` | Output file path. Omit to print to stdout. |
## Local debugging
Use the MCP Inspector to interactively test the server:
```shell
npx @modelcontextprotocol/inspector@0.21.1 \
  npx easy-peasy-mcp@0.0.11 -- \
  --config /path/to/.easypeasymcp.json
```

```shell
npx @modelcontextprotocol/inspector@0.21.1 \
  npx easy-peasy-mcp@0.0.11 -- \
  --llms /path/to/llms-full.txt \
  --openapi /path/to/openapi
```

To try it right now without any local files:

```shell
npx @modelcontextprotocol/inspector@0.21.1 \
  npx easy-peasy-mcp@0.0.11 -- \
  --llms https://raw.githubusercontent.com/derberg/EasyPeasyMCP/refs/heads/main/example-llms/asyncapi.txt
```

Tip: Add `--debug` to see detailed logging about content loading and search operations:

```shell
npx @modelcontextprotocol/inspector@0.21.1 \
  npx easy-peasy-mcp@0.0.11 -- \
  --config /path/to/.easypeasymcp.json \
  --debug
```

Debug logs appear in the Server Notifications view in the MCP Inspector UI.