Allows querying of any Mintlify-powered documentation site to ask technical questions, retrieve code examples, and get explanations through an AI-powered assistant.
Enables querying of Resend's documentation to answer technical questions and retrieve code snippets directly through the Mintlify assistant interface.
Enables querying of Upstash's documentation to answer technical questions and retrieve code snippets directly through the Mintlify assistant interface.
Enables querying of Vercel's documentation to answer technical questions and retrieve code snippets directly through the Mintlify assistant interface.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@mintlify-mcp How do I send an email with an attachment using Resend?"
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
Docmole
Docmole is an MCP server that lets you query any documentation site from AI assistants like Claude, Cursor, or any MCP-compatible client. The mole digs through docs so you don't have to.
Features
Universal docs support – works with any documentation site
Self-hosted RAG – LanceDB vectors + OpenAI embeddings, no Python needed
Zero-setup mode – instant access to Mintlify-powered sites
Multi-turn conversations – remembers context across questions
WebFetch compatible – links converted to absolute URLs
MCP native – works with Claude, Cursor, and any MCP client
Coming soon
Ollama support – fully local mode, no API keys needed
Generic HTML extraction – support for non-Mintlify documentation sites
Incremental updates – only re-index changed pages
Installation
To use Docmole, run it directly with bunx (no install needed):
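A minimal invocation might look like this (assuming the package is published as `docmole`):

```shell
# Run Docmole without installing it, using Bun's package runner
bunx docmole
```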
Or install globally:
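A global install with Bun would look like this (again assuming the `docmole` package name):

```shell
# Install Docmole globally so the `docmole` command is on your PATH
bun add -g docmole
```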
Works on macOS, Linux, and Windows. Requires the Bun runtime.
Getting started
Local RAG Mode (any docs site)
Index and query any documentation site. Requires OPENAI_API_KEY.
Add to your MCP client:
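A typical entry might look like the following sketch (the server name, arguments, and placement in e.g. Claude Desktop's `claude_desktop_config.json` are illustrative, not confirmed by this document):

```json
{
  "mcpServers": {
    "docmole": {
      "command": "bunx",
      "args": ["docmole"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```

Replace `sk-...` with your own OpenAI API key.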
Mintlify Mode (zero setup)
For sites with a Mintlify AI Assistant – no API key needed:
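One possible configuration sketch (the `--mintlify` flag and the project-ID argument are assumptions for illustration; check the CLI help for the real option names):

```json
{
  "mcpServers": {
    "resend-docs": {
      "command": "bunx",
      "args": ["docmole", "--mintlify", "resend"]
    }
  }
}
```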
CLI
Docmole has a built-in CLI for all operations:
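The subcommand names below are assumptions sketched for illustration; consult `docmole --help` for the actual interface:

```shell
# Crawl and index a documentation site into the local vector store
# (hypothetical subcommand)
docmole index https://docs.example.com

# Ask a question against the indexed docs (hypothetical subcommand)
docmole query "How do I configure webhooks?"
```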
Run docmole --help for all options.
How it works
Local RAG Mode: Crawls documentation, generates embeddings with OpenAI, stores in LanceDB. Hybrid search combines semantic and keyword matching.
Mintlify Mode: Proxies requests to Mintlify's AI Assistant API. Zero setup, instant results.
Known Mintlify Project IDs
Find more: Open DevTools → Network tab → use the AI assistant → look for
leaves.mintlify.com/api/assistant/{project-id}/message
Configuration
| Environment Variable | Default | Description |
| --- | --- | --- |
| `OPENAI_API_KEY` | – | Required for local RAG mode |
|  |  | Data directory for projects |
Project structure
Documentation
See AGENT.md for detailed documentation including:
Architecture details
Backend implementations
Enterprise deployment guides
Contributing
PRs welcome! See the contributing guide for details.
Acknowledgments
Mintlify for amazing documentation tooling
Anthropic for Claude and the MCP protocol
LanceDB for the vector database
License
The Docmole codebase is released under the MIT license.