The Docfork server provides up-to-date documentation and code examples for 9000+ code libraries through its get-library-docs tool.
- Retrieve Documentation: Fetch comprehensive documentation for any library by specifying the author/library name (e.g., "vercel/next.js", "shadcn-ui/ui")
- Focus on Topics: Target specific topics within a library (e.g., "routing", "authentication", "hooks") to get relevant documentation and code examples
- Control Output Size: Limit response length by specifying maximum token count to manage context size
- Automatic Library Selection: Intelligently finds and selects the most relevant library based on the provided name
- AI Integration: Integrates with various AI code editors and clients (Cursor, Claude, VS Code, JetBrains AI Assistant) via the Model Context Protocol (MCP)
The server returns detailed documentation with code examples directly from source, along with an explanation of the library selection process.
Docfork MCP: 🌿 @latest Docs to 9000+ Code Libraries in a Single Tool Call
❌ The Problem: Expired Knowledge
- Out-of-date code examples & stale data from year-old model training
- Hallucinated syntax & APIs
- Old or mismatched versions
✅ The Solution: @latest docs at warp speed
- Always in sync with the latest version of docs
- Accurate descriptions and code examples
- Sub-second retrieval results (500ms @ p95) in your AI code editor
Docfork MCP pulls @latest documentation and code examples straight from the source - and adds them right into your context.
Just tell Cursor to use docfork:
🛠️ Installation
📋 Requirements
- Node.js ≥ v18
- Cursor/Windsurf/Claude Desktop (any MCP client)
Installing via Smithery
To install the Docfork MCP Server for any client automatically via Smithery, use the Smithery CLI with your Smithery key. You can find your key on the Smithery.ai website.
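A sketch of the install command, assuming the standard Smithery CLI pattern (the package name and flags are assumptions; check Smithery's docs for current syntax):

```shell
# Replace "cursor" with your MCP client and supply your own Smithery key
npx -y @smithery/cli install docfork --client cursor --key <your-smithery-key>
```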
Install in Cursor
Go to: Settings -> Cursor Settings -> Tools & Integrations -> Add a custom MCP server
Pasting the following config into your Cursor ~/.cursor/mcp.json file is the recommended approach. You can also install in a specific project by creating .cursor/mcp.json in your project folder. See Cursor MCP docs for more info.
Cursor Remote Server Connection
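A minimal remote-server sketch, assuming the hosted endpoint https://mcp.docfork.com/mcp referenced later in this README:

```json
{
  "mcpServers": {
    "docfork": {
      "url": "https://mcp.docfork.com/mcp"
    }
  }
}
```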
Cursor Local Server Connection
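A minimal local-server sketch, assuming the server is published on npm under the name docfork:

```json
{
  "mcpServers": {
    "docfork": {
      "command": "npx",
      "args": ["-y", "docfork"]
    }
  }
}
```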
Install in Claude Code
Run this command. See Claude Code MCP docs for more info.
Claude Code Remote Server Connection
Claude Code Local Server Connection
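Hedged sketches of the commands; the docfork package name and hosted URL are assumptions, so check the Claude Code MCP docs for current syntax:

```shell
# Remote server connection (HTTP transport)
claude mcp add --transport http docfork https://mcp.docfork.com/mcp

# Local server connection via npx
claude mcp add docfork -- npx -y docfork
```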
Install in Claude Desktop
Add this to your Claude Desktop claude_desktop_config.json file. See Claude Desktop MCP docs for more info.
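A minimal sketch, assuming the npm package name docfork (Claude Desktop uses the command-based form):

```json
{
  "mcpServers": {
    "docfork": {
      "command": "npx",
      "args": ["-y", "docfork"]
    }
  }
}
```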
Install in Windsurf
Add this to your Windsurf MCP config. See Windsurf MCP docs for more info.
Windsurf Remote Server Connection
Windsurf Local Server Connection
Install in VS Code
Add this to your VS Code MCP config. See VS Code MCP docs for more info.
VS Code Remote Server Connection
VS Code Local Server Connection
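A sketch of the remote connection; note that VS Code's MCP config uses a servers key rather than mcpServers (as the Docker note later in this README also mentions). The endpoint URL is the hosted one referenced later in this README:

```json
{
  "servers": {
    "docfork": {
      "type": "http",
      "url": "https://mcp.docfork.com/mcp"
    }
  }
}
```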
Install in Zed
One-click install: → Get the Docfork Extension
Or Manual config (for power users):
Install in BoltAI
Open the "Settings" page of the app, navigate to "Plugins," and enter the following JSON:
More info is available on BoltAI's Documentation site. For BoltAI on iOS, see this guide.
Using Docker
If you prefer to run the MCP server in a Docker container:
- Build the Docker image: First, create a Dockerfile in the project root (or anywhere you prefer). Then build the image with a tag (e.g., docfork-mcp). Make sure Docker Desktop (or the Docker daemon) is running, and run the docker build command in the same directory where you saved the Dockerfile.
- Configure your MCP client: Update your MCP client's configuration to use the Docker command. Example for a cline_mcp_settings.json. Note: this is an example configuration; refer to the specific examples for your MCP client (such as Cursor or VS Code) earlier in this README to adapt the structure (e.g., mcpServers vs servers), and ensure the image name in args matches the tag used during the docker build command.
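A minimal Dockerfile sketch, assuming the server is published on npm as docfork:

```dockerfile
# Hypothetical Dockerfile for running the Docfork MCP server in a container
FROM node:18-alpine
WORKDIR /app
RUN npm install -g docfork
# stdio transport by default; the MCP client talks to the container over stdin/stdout
CMD ["docfork"]
```

Build it with `docker build -t docfork-mcp .`, then have your MCP client run `docker run -i --rm docfork-mcp`.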
Install in Windows
The configuration on Windows is slightly different compared to Linux or macOS (Cline is used in the example). The same principle applies to other editors; refer to the configuration of command and args.
Install in Augment Code
To configure Docfork MCP in Augment Code, follow these steps:
- Press Cmd/Ctrl+Shift+P or go to the hamburger menu in the Augment panel
- Select Edit Settings
- Under Advanced, click Edit in settings.json
- Add the server configuration to the mcpServers array in the augment.advanced object
Once the MCP server is added, restart your editor. If you receive any errors, check the syntax to make sure closing brackets or commas are not missing.
Install in Roo Code
Add this to your Roo Code MCP configuration file. See Roo Code MCP docs for more info.
Roo Code Remote Server Connection
Roo Code Local Server Connection
Install in Trae
Use the Add manually feature and fill in the JSON configuration information for that MCP server. For more details, visit the Trae documentation.
Trae Remote Server Connection
Trae Local Server Connection
Install in Visual Studio 2022
You can configure Docfork MCP in Visual Studio 2022 by following the Visual Studio MCP Servers documentation.
Add this to your Visual Studio MCP config file (see the Visual Studio docs for details):
Or, for a local server:
For more information and troubleshooting, refer to the Visual Studio MCP Servers documentation.
Install in Gemini CLI
See Gemini CLI Configuration for details.
- Open the Gemini CLI settings file. The location is ~/.gemini/settings.json (where ~ is your home directory).
- Add the following to the mcpServers object in your settings.json file:
Or, for a local server:
If the mcpServers object does not exist, create it.
Install in Crush
Add this to your Crush configuration file. See Crush MCP docs for more info.
Crush Remote Server Connection (HTTP)
Crush Remote Server Connection (SSE)
Crush Local Server Connection
Install in Cline
You can easily install Docfork through the Cline MCP Server Marketplace by following these instructions:
- Open Cline.
- Click the hamburger menu icon (☰) to enter the MCP Servers section.
- Use the search bar within the Marketplace tab to find Docfork.
- Click the Install button.
Install in Zencoder
To configure Docfork MCP in Zencoder, follow these steps:
- Go to the Zencoder menu (...)
- From the dropdown menu, select Agent tools
- Click Add custom MCP
- Add the name and server configuration from below, and make sure to hit the Install button
Once the MCP server is added, you can easily continue using it.
Install in Amazon Q Developer CLI
Add this to your Amazon Q Developer CLI configuration file. See Amazon Q Developer CLI docs for more details.
Install in Qodo Gen
See Qodo Gen docs for more details.
- Open Qodo Gen chat panel in VSCode or IntelliJ.
- Click Connect more tools.
- Click + Add new MCP.
- Add the following configuration:
Install in JetBrains AI Assistant
See JetBrains AI Assistant Documentation for more details.
- In JetBrains IDEs, go to Settings -> Tools -> AI Assistant -> Model Context Protocol (MCP)
- Click + Add.
- Click Command in the top-left corner of the dialog and select the As JSON option from the list
- Add this configuration and click OK
- Click Apply to save changes.
- In the same way, docfork can be added for JetBrains Junie in Settings -> Tools -> Junie -> MCP Settings
Install in Warp
See Warp Model Context Protocol Documentation for details.
- Navigate to Settings > AI > Manage MCP servers.
- Add a new MCP server by clicking the + Add button.
- Paste the configuration given below:
- Click Save to apply the changes.
Install in Opencode
Add this to your Opencode configuration file. See Opencode MCP docs for more info.
Opencode Remote Server Connection
Opencode Local Server Connection
Using Docfork with Copilot Coding Agent
Add the following configuration to the mcp section of your Copilot Coding Agent configuration file (Repository -> Settings -> Copilot -> Coding agent -> MCP configuration):
For more information, see the official GitHub documentation.
Install in Kiro
See Kiro Model Context Protocol Documentation for details.
- Navigate to Kiro > MCP Servers.
- Add a new MCP server by clicking the + Add button.
- Paste the configuration given below:
- Click Save to apply the changes.
Install in OpenAI Codex
See OpenAI Codex for more information.
Add the following configuration to your OpenAI Codex MCP server settings:
Install in LM Studio
See LM Studio MCP Support for more information.
One-click install:
Manual set-up:
- Navigate to Program (right side) > Install > Edit mcp.json.
- Paste the configuration given below:
- Click Save to apply the changes.
- Toggle the MCP server on/off from the right-hand side, under Program, or by clicking the plug icon at the bottom of the chat box.
Install in Perplexity
See Local and Remote MCPs for Perplexity for more information.
- Navigate to Perplexity > Settings.
- Select Connectors.
- Click Add Connector.
- Select Advanced.
- Enter Server Name: Docfork
- Paste the following JSON in the text area:
- Click Save.
🔨 Available Tools
Docfork MCP provides the following tool that LLMs can use:
- get-library-docs: Searches the library and returns its documentation.
  - libraryName (required): The name of the library to search for
  - topic (required): Focus the docs on a specific topic (e.g., "routing", "hooks")
  - tokens (optional, default 10000, max 50000): Max number of tokens to return. Values less than the configured DEFAULT_MINIMUM_TOKENS value or the default value of 10000 are automatically increased to that value.
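For reference, the arguments payload an MCP client would send when invoking this tool looks roughly like the following (values are illustrative):

```json
{
  "name": "get-library-docs",
  "arguments": {
    "libraryName": "vercel/next.js",
    "topic": "routing",
    "tokens": 10000
  }
}
```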
💡 Tips
Add a Rule
If you don't want to add use docfork to every prompt, you can define a simple rule from your Cursor Settings > Rules section (or the equivalent in your MCP client) to auto-invoke Docfork on any code question:
From then on you'll get Docfork's docs in any related conversation without typing anything extra. You can add your use cases to the match part.
Use Specific Library Names
When you know exactly which library you want to use, be specific in your prompts. This helps Docfork find the right documentation faster and more accurately:
The more specific you are about the library and what you want to accomplish, the better documentation you'll receive.
Development
Clone the project and install dependencies:
Build:
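A typical sketch of these steps; the repository URL here is hypothetical, so substitute the actual one:

```shell
# Clone the project and install dependencies (URL is a placeholder)
git clone https://github.com/docfork/mcp.git
cd mcp
npm install

# Build
npm run build
```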
The Docfork MCP server supports the following environment variables:
- DEFAULT_MINIMUM_TOKENS: Set the minimum token count for documentation retrieval (default: 10000)

For HTTP/SSE Transport Only
The following environment variables are only relevant when running the server as an HTTP/SSE service (not for standard npx usage):
- MCP_TRANSPORT: Set the transport type for MCP communication (default: stdio; options: streamable-http, stdio, sse)
- PORT: Set the port number for HTTP/SSE transport (default: 3000; only used when MCP_TRANSPORT is streamable-http or sse)
Standard node server configuration (most common):
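A sketch of a client config passing DEFAULT_MINIMUM_TOKENS through the env block, assuming the npm package name docfork:

```json
{
  "mcpServers": {
    "docfork": {
      "command": "npx",
      "args": ["-y", "docfork"],
      "env": {
        "DEFAULT_MINIMUM_TOKENS": "10000"
      }
    }
  }
}
```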
HTTP/SSE server configuration (for custom deployments):
These environment variables are used when you're running your own instance of the Docfork server, not when connecting to remote servers. For remote server connections, use the URL-based configurations shown earlier in this README (e.g., "url": "https://mcp.docfork.com/mcp").
If you're self-hosting and want to run the server with HTTP/SSE transport:
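A sketch, assuming the npm package name docfork; the variables are the ones documented above:

```shell
MCP_TRANSPORT=streamable-http PORT=3000 npx -y docfork
```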
🚨 Troubleshooting
If you encounter ERR_MODULE_NOT_FOUND, try using bunx instead of npx:
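For example, in a client config (package name assumed to be docfork):

```json
{
  "mcpServers": {
    "docfork": {
      "command": "bunx",
      "args": ["-y", "docfork"]
    }
  }
}
```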
This often resolves module resolution issues in environments where npx doesn't properly install or resolve packages.
For errors like Error: Cannot find module 'uriTemplate.js', try the --experimental-vm-modules flag:
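A sketch of passing the flag through npx in a client config (package name assumed to be docfork):

```json
{
  "mcpServers": {
    "docfork": {
      "command": "npx",
      "args": ["--node-options=--experimental-vm-modules", "-y", "docfork"]
    }
  }
}
```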
Use the --experimental-fetch flag to bypass TLS-related problems:
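The same pattern applies here; a sketch with the package name assumed to be docfork:

```json
{
  "mcpServers": {
    "docfork": {
      "command": "npx",
      "args": ["--node-options=--experimental-fetch", "-y", "docfork"]
    }
  }
}
```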
- Try adding @latest to the package name
- Use bunx as an alternative to npx
- Consider using deno as another alternative
- Ensure you're using Node.js v18 or higher for native fetch support
⚠️ Disclaimer
Docfork is an open, community-driven catalogue. Although we review submissions, we make no warranties—express or implied—about the accuracy, completeness, or security of any linked documentation or code. Projects listed here are created and maintained by their respective authors, not by Docfork.
If you spot content that is suspicious, inappropriate, or potentially harmful, please contact us.
By using Docfork, you agree to do so at your own discretion and risk.
🌟 Let's Connect!
Stay in the loop and meet the community:
Star History
License
MIT