Context7 MCP - Up-to-date Docs For Any Cursor Prompt
❌ Without Context7
LLMs rely on outdated or generic information about the libraries you use. You get:
- ❌ Code examples are outdated and based on year-old training data
- ❌ Hallucinated APIs that don't even exist
- ❌ Generic answers for old package versions
✅ With Context7
Context7 MCP pulls up-to-date, version-specific documentation and code examples straight from the source — and places them directly into your prompt.
Add `use context7` to your question in Cursor:
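For example, a prompt might look like this (the project described here is only an illustration):

```txt
Create a basic Next.js project with app router. use context7
```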
Context7 fetches up-to-date documentation and working code examples right into your LLM’s context.
- 1️⃣ Ask your question naturally
- 2️⃣ Tell the LLM to `use context7`
- 3️⃣ Get working code answers
No tab-switching, no hallucinated APIs that don't exist, no outdated code generations.
🛠️ Getting Started
Requirements
- Node.js >= v18.0.0
- Cursor, Windsurf, Claude Desktop or another MCP Client
Install in Cursor
Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server
Paste this into your Cursor ~/.cursor/mcp.json file. See Cursor MCP docs for more info.
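A minimal configuration sketch, assuming the server is published on npm as @upstash/context7-mcp (adjust the package name if yours differs):

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp@latest"]
    }
  }
}
```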
Install in Windsurf
Add this to your Windsurf MCP config file. See Windsurf MCP docs for more info.
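The shape is the same as for Cursor; a sketch, assuming Windsurf reads its MCP servers from ~/.codeium/windsurf/mcp_config.json:

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp@latest"]
    }
  }
}
```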
Install in VSCode
Add this to your VSCode MCP config file. See VSCode MCP docs for more info.
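A sketch for VS Code, assuming the workspace-level .vscode/mcp.json format with a top-level "servers" key:

```json
{
  "servers": {
    "context7": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp@latest"]
    }
  }
}
```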
Available Tools
- `resolve-library-id`: Resolves a general library name into a Context7-compatible library ID.
  - `libraryName` (optional): Search and rerank results
- `get-library-docs`: Fetches documentation for a library using a Context7-compatible library ID.
  - `context7CompatibleLibraryID` (required)
  - `topic` (optional): Focus the docs on a specific topic (e.g., "routing", "hooks")
  - `tokens` (optional, default 5000): Max number of tokens to return
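For reference, this is roughly what a raw MCP `tools/call` request for `get-library-docs` looks like; the library ID shown is only an illustration, so resolve the real one first with `resolve-library-id`:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get-library-docs",
    "arguments": {
      "context7CompatibleLibraryID": "vercel/nextjs",
      "topic": "routing",
      "tokens": 5000
    }
  }
}
```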
Development
Clone the project and install dependencies:
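A sketch, assuming the repository lives at upstash/context7-mcp on GitHub and that npm is the package manager (the project may use a different one, e.g. bun):

```bash
# Clone the repository (URL assumed) and install dependencies
git clone https://github.com/upstash/context7-mcp.git
cd context7-mcp
npm install
```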
Build:
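Assuming a standard build script is defined in package.json:

```bash
npm run build
```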
Local Configuration Example
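To point an MCP client at your local build instead of the published package, a configuration along these lines works; the output path is an assumption and depends on your build setup:

```json
{
  "mcpServers": {
    "context7": {
      "command": "node",
      "args": ["/path/to/context7-mcp/dist/index.js"]
    }
  }
}
```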
Testing with MCP Inspector
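A sketch of running the server under the official MCP Inspector (the inspector is published as @modelcontextprotocol/inspector):

```bash
npx -y @modelcontextprotocol/inspector npx -y @upstash/context7-mcp@latest
```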
License
MIT