<p align="center">
<a href="https://github.com/Vigtu/docmole">
<img loading="lazy" alt="docmole" src="https://raw.githubusercontent.com/Vigtu/docmole/main/assets/docmole-hero.svg" width="100%"/>
</a>
</p>
# Docmole
<p align="center">
<em>Dig through any documentation with AI</em>
</p>
[npm](https://www.npmjs.com/package/docmole) · [MIT License](https://opensource.org/licenses/MIT) · [Bun](https://bun.sh) · [MCP](https://modelcontextprotocol.io)
Docmole is an MCP server that lets you query **any documentation site** from AI assistants like Claude, Cursor, or any MCP-compatible client. The mole digs through docs so you don't have to.
## Features
* **Universal docs support** – works with any documentation site
* **Self-hosted RAG** – LanceDB vectors + OpenAI embeddings, no Python needed
* **Zero-setup mode** – instant access to Mintlify-powered sites
* **Multi-turn conversations** – remembers context across questions
* **WebFetch compatible** – links converted to absolute URLs
* **MCP native** – works with Claude, Cursor, and any MCP client
### Coming soon
* **Ollama support** – fully local mode, no API keys needed
* **Generic HTML extraction** – support for non-Mintlify documentation sites
* **Incremental updates** – only re-index changed pages
## Installation
To use Docmole, run it directly with bunx (no install needed):
```bash
bunx docmole --help
```
Or install globally:
```bash
bun install -g docmole
```
Works on macOS, Linux, and Windows. Requires the [Bun](https://bun.sh) runtime.
## Getting started
### Local RAG Mode (any docs site)
Index and query any documentation site. Requires `OPENAI_API_KEY`.
```bash
# One-time setup: discovers pages and builds the vector index
bunx docmole setup --url https://docs.example.com --id my-docs
# Start the MCP server
bunx docmole serve --project my-docs
```
Add to your MCP client:
```json
{
"mcpServers": {
"my-docs": {
"command": "bunx",
"args": ["docmole", "serve", "--project", "my-docs"]
}
}
}
```
### Mintlify Mode (zero setup)
For sites with [Mintlify AI Assistant](https://mintlify.com) – no API key needed:
```bash
bunx docmole -p agno-v2
```
```json
{
"mcpServers": {
"agno-docs": {
"command": "bunx",
"args": ["docmole", "-p", "agno-v2"]
}
}
}
```
## CLI
Docmole has a built-in CLI for all operations:
```bash
# Mintlify mode (proxy to Mintlify API)
docmole -p <project-id>
# Local RAG mode
docmole setup --url <docs-url> --id <project-id>
docmole serve --project <project-id>
docmole list
docmole stop --project <project-id>
```
Run `docmole --help` for all options.
## How it works
```
┌──────────────┐      ┌──────────────┐      ┌───────────────────────┐
│  MCP Client  │─────▶│   Docmole    │─────▶│  Embedded: LanceDB    │
│  (Claude,    │◀─────│  MCP Server  │◀─────│  Mintlify: API proxy  │
│  Cursor...)  │      └──────────────┘      └───────────────────────┘
└──────────────┘
```
**Local RAG Mode**: Crawls the documentation, generates embeddings with OpenAI, and stores them in LanceDB. Hybrid search combines semantic and keyword matching.
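To illustrate the idea behind hybrid search (this is a toy sketch, not docmole's actual implementation): each document gets a semantic score (cosine similarity between embeddings) and a keyword score (query-term overlap), and the two are blended with a weight `alpha`.

```typescript
// Toy hybrid retrieval scorer. The Doc shape, alpha weight, and scoring
// formula are illustrative assumptions, not docmole internals.
type Doc = { id: string; text: string; embedding: number[] };

// Semantic score: cosine similarity between two embedding vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Keyword score: fraction of query terms that appear in the document.
function keywordScore(query: string, text: string): number {
  const terms = new Set(query.toLowerCase().split(/\W+/).filter(Boolean));
  const words = new Set(text.toLowerCase().split(/\W+/).filter(Boolean));
  let hits = 0;
  for (const t of terms) if (words.has(t)) hits++;
  return terms.size ? hits / terms.size : 0;
}

// Blend both signals and return documents ranked best-first.
function hybridRank(queryEmbedding: number[], query: string, docs: Doc[], alpha = 0.5): Doc[] {
  const score = (d: Doc) =>
    alpha * cosine(queryEmbedding, d.embedding) + (1 - alpha) * keywordScore(query, d.text);
  return [...docs].sort((x, y) => score(y) - score(x));
}

const docs: Doc[] = [
  { id: "a", text: "configure the vector index", embedding: [1, 0] },
  { id: "b", text: "install the CLI with bun", embedding: [0, 1] },
];
console.log(hybridRank([0.9, 0.1], "vector index", docs)[0].id); // "a"
```

In practice the embeddings come from OpenAI and the vectors live in LanceDB; the blending step is the part sketched here.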
**Mintlify Mode**: Proxies requests to Mintlify's AI Assistant API. Zero setup, instant results.
## Known Mintlify Project IDs
| Documentation | Project ID |
|--------------|------------|
| [Agno](https://docs.agno.com) | `agno-v2` |
| [Resend](https://resend.com/docs) | `resend` |
| [Mintlify](https://mintlify.com/docs) | `mintlify` |
| [Vercel](https://vercel.com/docs) | `vercel` |
| [Upstash](https://upstash.com/docs) | `upstash` |
| [Plain](https://plain.com/docs) | `plain` |
> **Find more**: Open DevTools → Network tab → use the AI assistant → look for `leaves.mintlify.com/api/assistant/{project-id}/message`
## Configuration
| Environment Variable | Default | Description |
|---------------------|---------|-------------|
| `OPENAI_API_KEY` | (none) | Required for local RAG mode |
| `DOCMOLE_DATA_DIR` | `~/.docmole` | Data directory for projects |
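For example, a typical shell setup for local RAG mode (the key and directory values are placeholders; substitute your own):

```shell
export OPENAI_API_KEY="sk-..."                # required for local RAG mode
export DOCMOLE_DATA_DIR="$HOME/docmole-data"  # optional; defaults to ~/.docmole

# Then run any local RAG command, e.g.:
# bunx docmole setup --url https://docs.example.com --id my-docs
```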
### Project structure
```
~/.docmole/
├── projects/
│   └── <project-id>/
│       ├── config.yaml   # Project configuration
│       └── lancedb/      # Vector database
└── global.yaml           # Global settings
```
## Documentation
See [AGENT.md](./AGENT.md) for detailed documentation including:
- Architecture details
- Backend implementations
- Enterprise deployment guides
## Contributing
PRs welcome! See the [contributing guide](./CONTRIBUTING.md) for details.
## Acknowledgments
- [Mintlify](https://mintlify.com) for amazing documentation tooling
- [Anthropic](https://anthropic.com) for Claude and the MCP protocol
- [LanceDB](https://lancedb.com) for the vector database
## License
The Docmole codebase is released under the [MIT license](./LICENSE).