- Runs as a Node.js application with built-in support for scraping, indexing, and searching documentation for Node.js packages and libraries.
- Uses OpenAI's embedding capabilities to generate vector embeddings for documentation chunks, enabling semantic search over documentation content.
- Enables scraping, indexing, and searching React documentation with version-specific support, allowing users to search across different React versions.
- Uses semantic-release to automate the release process based on commit messages, handling version bumping and changelog generation.
- Supports scraping and indexing SemVer documentation, with examples showing how to fetch and search through the node-semver documentation.
- Leverages SQLite with sqlite-vec for efficient vector similarity search and FTS5 for full-text search when indexing documentation.
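The sqlite-vec + FTS5 combination described above can be sketched in miniature. This is a hypothetical two-stage lookup: FTS5 supplies keyword candidates, and cosine similarity over vectors reranks them semantically. The tiny three-dimensional vectors are made-up stand-ins for real OpenAI embeddings, and plain Python replaces the sqlite-vec extension, which is omitted here.

```python
import math
import sqlite3

# Toy documentation chunks: (title, body, fake 3-d "embedding").
chunks = [
    ("useEffect", "React hook for running side effects", [0.9, 0.1, 0.0]),
    ("useState", "React hook for local component state", [0.3, 0.9, 0.1]),
    ("docker run", "Start a container from an image", [0.0, 0.1, 0.9]),
]

# FTS5 virtual table for the keyword-search stage.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE fts USING fts5(title, body)")
conn.executemany("INSERT INTO fts VALUES (?, ?)", [(t, b) for t, b, _ in chunks])

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def search(keyword, query_vec):
    # Stage 1: full-text candidates; Stage 2: semantic rerank.
    rows = conn.execute(
        "SELECT title FROM fts WHERE fts MATCH ?", (keyword,)
    ).fetchall()
    vecs = {t: v for t, _, v in chunks}
    return sorted((t for (t,) in rows), key=lambda t: -cosine(vecs[t], query_vec))

print(search("hook", [0.85, 0.15, 0.05]))  # → ['useEffect', 'useState']
```

In the real server the rerank stage runs inside SQLite via sqlite-vec rather than in application code, but the shape of the query pipeline is the same.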
1. Click on "Install Server".
2. Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
3. In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@docs-mcp-server search for React hooks documentation, version 18.2.0".
4. That's it! The server will respond to your query, and you can continue using it as needed.

Here is a step-by-step guide with screenshots.
# Grounded Docs: Your AI's Up-to-Date Documentation Expert
Docs MCP Server solves the problem of AI hallucinations and outdated knowledge by providing a personal, always-current documentation index for your AI coding assistant. It fetches official docs from websites, GitHub, npm, PyPI, and local files, allowing your AI to query the exact version you are using.

## ✨ Why Grounded Docs MCP Server?

The open-source alternative to Context7, Nia, and Ref.Tools.

- ✅ **Up-to-Date Context:** Fetches documentation directly from official sources on demand.
- 🎯 **Version-Specific:** Queries target the exact library versions in your project.
- 💡 **Reduces Hallucinations:** Grounds LLMs in real documentation.
- 🔒 **Private & Local:** Runs entirely on your machine; your code never leaves your network.
- 🧩 **Broad Compatibility:** Works with any MCP-compatible client (Claude, Cline, etc.).
- 🌐 **Multiple Sources:** Index websites, GitHub repositories, local folders, and zip archives.
- 📄 **Rich File Support:** Processes HTML, Markdown, PDF, Word (.docx), Excel, PowerPoint, and source code.
## 🚀 Quick Start
1. Start the server (requires Node.js 22+):

   ```shell
   npx @arabold/docs-mcp-server@latest
   ```

2. Open the Web UI at http://localhost:6280 to add documentation.

3. Connect your AI client by adding this to your MCP settings (e.g., `claude_desktop_config.json`):
   ```json
   {
     "mcpServers": {
       "docs-mcp-server": {
         "type": "sse",
         "url": "http://localhost:6280/sse"
       }
     }
   }
   ```

See Connecting Clients for VS Code (Cline, Roo) and other setup options.
Alternatively, run the server with Docker:

```shell
docker run --rm \
  -v docs-mcp-data:/data \
  -v docs-mcp-config:/config \
  -p 6280:6280 \
  ghcr.io/arabold/docs-mcp-server:latest \
  --protocol http --host 0.0.0.0 --port 6280
```

## 🔧 Configure Embedding Model (Recommended)
Using an embedding model is optional but dramatically improves search quality by enabling semantic vector search.
**Example: Enable OpenAI Embeddings**

```shell
OPENAI_API_KEY="sk-proj-..." npx @arabold/docs-mcp-server@latest
```

See Embedding Models for configuring Ollama, Gemini, Azure, and others.
## 📚 Documentation

### Getting Started

- **Installation:** Detailed setup guides for Docker, Node.js (npx), and Embedded mode.
- **Connecting Clients:** How to connect Claude, VS Code (Cline/Roo), and other MCP clients.
- **Basic Usage:** Using the Web UI, CLI, and scraping local files.
- **Configuration:** Full reference for config files and environment variables.
- **Embedding Models:** Configure OpenAI, Ollama, Gemini, and other providers.

### Key Concepts & Architecture

- **Deployment Modes:** Standalone vs. Distributed (Docker Compose).
- **Authentication:** Securing your server with OAuth2/OIDC.
- **Telemetry:** Privacy-first usage data collection.
- **Architecture:** Deep dive into the system design.
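For a quick taste of running the server under Docker Compose, the standalone `docker run` invocation from the Quick Start can be translated into a minimal Compose file. This is an illustrative single-service sketch, not the project's official distributed setup, which the Deployment Modes guide covers; the service and volume names are carried over from the Quick Start command.

```yaml
# Minimal single-service sketch mirroring the Quick Start `docker run` command.
services:
  docs-mcp-server:
    image: ghcr.io/arabold/docs-mcp-server:latest
    command: --protocol http --host 0.0.0.0 --port 6280
    ports:
      - "6280:6280"
    volumes:
      - docs-mcp-data:/data
      - docs-mcp-config:/config

volumes:
  docs-mcp-data:
  docs-mcp-config:
```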
## 🤝 Contributing
We welcome contributions! Please see CONTRIBUTING.md for development guidelines and setup instructions.
## License
This project is licensed under the MIT License. See LICENSE for details.