The Docfork server provides up-to-date documentation and code examples for over 9000 code libraries through its get-library-docs tool.
Retrieve Documentation: Fetch comprehensive documentation for any library by specifying the author/library name (e.g., "vercel/next.js", "shadcn-ui/ui")
Focus on Topics: Target specific topics within a library (e.g., "routing", "authentication", "hooks") to get relevant documentation and code examples
Control Output Size: Limit response length by specifying maximum token count to manage context size
Automatic Library Selection: Intelligently finds and selects the most relevant library based on the provided name
AI Integration: Integrates with various AI code editors and clients (Cursor, Claude, VS Code, JetBrains AI Assistant) via the Model Context Protocol (MCP)
The server returns detailed documentation with code examples directly from source, along with an explanation of the library selection process.
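In practice your MCP client builds the request for you, but for orientation, here is a minimal sketch of a raw MCP tools/call request to get-library-docs. The argument names (libraryName, topic, tokens) are assumptions drawn from the feature list above, not the server's published schema:

```jsonc
// Minimal sketch of an MCP "tools/call" request to get-library-docs.
// The argument names below are illustrative guesses; check the server's
// tools/list response for the real parameter names.
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get-library-docs",
    "arguments": {
      "libraryName": "vercel/next.js", // author/library name
      "topic": "routing",              // optional: focus the docs on a topic
      "tokens": 10000                  // optional: cap the response size
    }
  }
}
```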
Provides an alternative installation method using the Bun package manager for users experiencing issues with NPX (see the sketch after this list)
Offers alternative runtime environment for running the Docfork MCP server when users encounter bundler issues
Supports containerized deployment of the Docfork MCP server with a provided Dockerfile configuration
Provides up-to-date documentation and examples for Next.js development, as referenced in the usage examples
Requires Node.js ≥ v18 as runtime environment, with native fetch API support
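For example, if NPX is causing problems, a Bun-based entry in your client's MCP config is one option. This is a minimal sketch that assumes the server is published on npm as docfork; confirm the package name against the Docfork installation guide:

```jsonc
{
  "mcpServers": {
    "docfork": {
      // bunx is Bun's npx equivalent; "docfork" is the assumed npm package name
      "command": "bunx",
      "args": ["-y", "docfork"]
    }
  }
}
```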
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Docfork show me the latest React useState hook documentation".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.

Docfork MCP - Up-to-date Docs for AI Agents.
Stop hallucinations, context bloat, and outdated APIs.
Professional developers use Docfork to enforce context isolation with project-specific Cabinets. By hard-locking your LLM to a verified stack, you ensure deterministic accuracy and minimal token overhead, eliminating the latency and context bloat of general-purpose indexes.
⚡ The Docfork Difference
Other context MCPs treat docs like a general search engine; Docfork treats them like a deterministic build artifact:
✅ Context Isolation: Use Cabinets to hard-lock your agent to a verified stack (e.g., Next.js + Better Auth) to stop context poisoning from unwanted libraries.
✅ SOTA Index: 10,000+ libraries, pre-chunked and ready. ~200ms global edge-cached retrieval for Markdown docs & code snippets.
✅ Team-First: Standardize your organization's context with API keys & Cabinets so every engineer—and agent—is on the same page.
🚀 Quick Start
1. Get your Free API Key
Grab a free key at docfork.com.
Free Tier: 1,000 requests/month (per org).
Team: 5 free seats included per org.
Pro Tier & Private Docs: Coming soon 🚀
2. Install MCP
Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server
The recommended approach is to paste the following configuration into your ~/.cursor/mcp.json file. You can also install it in a specific project by creating .cursor/mcp.json in your project folder. See the Cursor MCP docs for more info.
Since Cursor 1.0, you can click the install buttons below for instant one-click installation.
Cursor Remote Server Connection
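A minimal sketch of a remote (HTTP) entry for ~/.cursor/mcp.json. The endpoint URL and Authorization header are assumptions; copy the exact values from your docfork.com dashboard:

```jsonc
{
  "mcpServers": {
    "docfork": {
      // Assumed endpoint and auth header; use the values shown at docfork.com
      "url": "https://mcp.docfork.com/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}
```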
Cursor Local Server Connection
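And a local (stdio) sketch, assuming the docfork npm package and an API key passed through an environment variable; the variable name here is a placeholder, so check the installation guide for the exact name:

```jsonc
{
  "mcpServers": {
    "docfork": {
      "command": "npx",
      "args": ["-y", "docfork"],
      "env": {
        // Placeholder variable name; confirm it in the Docfork docs
        "DOCFORK_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
```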
Run this command. See Claude Code MCP docs for more info.
Claude Code Local Server Connection
Claude Code Remote Server Connection
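If you would rather check the configuration into your repo than run the CLI command, Claude Code can also read a project-scoped .mcp.json with the same mcpServers shape; a sketch under the same assumptions as the Cursor examples above:

```jsonc
// .mcp.json at the project root (picked up by Claude Code)
{
  "mcpServers": {
    "docfork": {
      "command": "npx",
      "args": ["-y", "docfork"]
    }
  }
}
```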
Add this to your OpenCode configuration file. See OpenCode MCP docs for more info.
OpenCode Remote Server Connection
OpenCode Local Server Connection
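As a hedged sketch only, an opencode.json entry for the remote connection might look like the following; the field names (mcp, type, url, enabled) and the endpoint are assumptions, so verify them against the OpenCode MCP docs and your docfork.com dashboard:

```jsonc
{
  "mcp": {
    "docfork": {
      "type": "remote",                     // a local entry would instead use "type": "local" with a "command" array
      "url": "https://mcp.docfork.com/mcp", // assumed endpoint
      "enabled": true
    }
  }
}
```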
See Setup for Windsurf, Roo Code, and 40+ others →
Docfork supports the MCP OAuth spec. Change your endpoint to use OAuth.
Note: OAuth is for remote HTTP connections only.
3. Use Docfork
Tell your AI to fetch specific, version-accurate documentation for your stack by adding "use docfork" to your prompt.
4. Add a rule to auto-invoke Docfork MCP
Don't want to type "use docfork" every time? Add a rule to make your AI fetch docs automatically.
Once enabled, your AI will automatically fetch the latest docs whenever you ask library- or framework-specific questions.
🔨 Available Tools
| Tool | Purpose |
| --- | --- |
| | Context-Aware Search. Respects your Cabinets. |
| | The Deep Dive. Fetches full Markdown content from a search result URL when the snippet isn't enough. |
📖 Documentation
Installation Guides – Comprehensive setup for all IDEs.
Cabinets – Context isolation and project-scoped documentation.
Library Identifiers – Improve accuracy with owner/repo targeting.
Troubleshooting – Fix connection or auth issues.
💬 Connect with Us
Official Changelog – We are constantly shipping!
X (Twitter) – Follow for latest updates.
Found an issue? Raise a GitHub issue or contact support.
Star History
Disclaimer
Docfork is an open, community-driven catalogue. While we review submissions, we cannot guarantee accuracy for every project listed. If you spot an issue, raise a GitHub issue or contact support.
License
MIT