MCP-Ragdocs
A Model Context Protocol (MCP) server that enables semantic search and retrieval of documentation using a vector database (Qdrant). This server allows you to add documentation from URLs or local files and then search through them using natural language queries.
Server Configuration
Describes the environment variables required to run the server.
| Name | Required | Description | Default |
|---|---|---|---|
| OLLAMA_URL | No | URL of your Ollama instance. | http://localhost:11434 |
| QDRANT_URL | Yes | URL of your Qdrant instance. For local use: http://localhost:6333. For Qdrant Cloud: https://your-cluster-url.qdrant.tech | |
| OPENAI_API_KEY | No | Your OpenAI API key, required if using OpenAI as the embedding provider. | |
| QDRANT_API_KEY | No | Your Qdrant Cloud API key, required if using Qdrant Cloud. | |
| EMBEDDING_MODEL | No | Embedding model to use. Defaults to 'nomic-embed-text' for Ollama and 'text-embedding-3-small' for OpenAI. | |
| EMBEDDING_PROVIDER | No | Choose between 'ollama' (default) or 'openai' for the embedding provider. | ollama |
Version
Current version: 0.1.6
Features
- Add documentation from URLs or local files
- Store documentation in a vector database for semantic search
- Search through documentation using natural language
- List all documentation sources
Installation
Install globally using npm:
This will install the server in your global npm directory, which you'll need for the configuration steps below.
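For example (the package name `@qpd-v/mcp-ragdocs` is an assumption — check the project's package.json for the published name):

```shell
# Install the server globally; the package name is an assumption
npm install -g @qpd-v/mcp-ragdocs
```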
Requirements
- Node.js 16 or higher
- Qdrant (either local or cloud)
- One of the following for embeddings:
- Ollama running locally (default, free)
- OpenAI API key (optional, paid)
Qdrant Setup Options
Option 1: Local Qdrant
- Using Docker (recommended):
- Or download from Qdrant's website
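The Docker route from the list above can be sketched as follows, using the official Qdrant image:

```shell
# Pull and run Qdrant locally, exposing the default REST (6333) and gRPC (6334) ports
docker run -p 6333:6333 -p 6334:6334 qdrant/qdrant
```

Once running, the REST API is available at http://localhost:6333, which matches the default QDRANT_URL used in the configuration below.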
Option 2: Qdrant Cloud
- Create an account at Qdrant Cloud
- Create a new cluster
- Get your cluster URL and API key from the dashboard
- Use these in your configuration (see Configuration section below)
Configuration
The server can be used with both Cline and Claude Desktop. Configuration differs slightly between them:
Cline Configuration
Add to your Cline settings file (%AppData%\Code\User\globalStorage\rooveterinaryinc.roo-cline\settings\cline_mcp_settings.json):
- Using npm global install (recommended):
For OpenAI instead of Ollama:
- Using local development setup:
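As a sketch of the npm-global variant with Ollama, an entry might look like this (the package name `@qpd-v/mcp-ragdocs`, the `ragdocs` key, and the install path are assumptions — adjust them to your npm prefix and the package you installed):

```json
{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": [
        "C:/Users/YOUR_USERNAME/AppData/Roaming/npm/node_modules/@qpd-v/mcp-ragdocs/build/index.js"
      ],
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```

For OpenAI, the same entry would swap the `env` block to set `"EMBEDDING_PROVIDER": "openai"` and `"OPENAI_API_KEY": "your-api-key"` in place of the Ollama settings; for a local development setup, point `args` at your checkout's build output instead.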
Claude Desktop Configuration
Add to your Claude Desktop config file:
- Windows:
%AppData%\Claude\claude_desktop_config.json
- macOS:
~/Library/Application Support/Claude/claude_desktop_config.json
- Windows Setup with Ollama (using full paths):
Windows Setup with OpenAI:
- macOS Setup with Ollama:
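A Windows-with-Ollama entry might look like the following sketch (package name and paths are assumptions — substitute your username and npm prefix):

```json
{
  "mcpServers": {
    "ragdocs": {
      "command": "C:\\Program Files\\nodejs\\node.exe",
      "args": [
        "C:\\Users\\YOUR_USERNAME\\AppData\\Roaming\\npm\\node_modules\\@qpd-v\\mcp-ragdocs\\build\\index.js"
      ],
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```

The OpenAI and macOS variants follow the same shape: on macOS, `command` is typically just `node` and the module lives under your npm global prefix; for OpenAI, replace the Ollama env settings with `"EMBEDDING_PROVIDER": "openai"` and `"OPENAI_API_KEY": "your-api-key"`.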
Qdrant Cloud Configuration
For either Cline or Claude Desktop, when using Qdrant Cloud, modify the env section:
With Ollama:
With OpenAI:
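For illustration, the Qdrant Cloud `env` block with Ollama embeddings would look like this sketch (cluster URL and API key are placeholders):

```json
"env": {
  "QDRANT_URL": "https://your-cluster-url.qdrant.tech",
  "QDRANT_API_KEY": "your-qdrant-api-key",
  "EMBEDDING_PROVIDER": "ollama",
  "OLLAMA_URL": "http://localhost:11434"
}
```

The OpenAI variant keeps the two Qdrant entries and replaces the Ollama settings with `"EMBEDDING_PROVIDER": "openai"` and `"OPENAI_API_KEY": "your-api-key"`.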
Environment Variables
Qdrant Configuration
- QDRANT_URL (required): URL of your Qdrant instance
  - For local: http://localhost:6333
  - For cloud: https://your-cluster-url.qdrant.tech
- QDRANT_API_KEY (required for cloud): Your Qdrant Cloud API key

Embeddings Configuration
- EMBEDDING_PROVIDER (optional): Choose between 'ollama' (default) or 'openai'
- EMBEDDING_MODEL (optional):
  - For Ollama: defaults to 'nomic-embed-text'
  - For OpenAI: defaults to 'text-embedding-3-small'
- OLLAMA_URL (optional): URL of your Ollama instance (defaults to http://localhost:11434)
- OPENAI_API_KEY (required if using OpenAI): Your OpenAI API key
Available Tools
add_documentation
- Add documentation from a URL to the RAG database
- Parameters:
  - url: URL of the documentation to fetch

search_documentation
- Search through stored documentation
- Parameters:
  - query: Search query
  - limit (optional): Maximum number of results to return (default: 5)

list_sources
- List all documentation sources currently stored
- No parameters required
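For illustration, an MCP client invokes one of these tools with a JSON-RPC `tools/call` request; a sketch for search_documentation (the query string is a made-up example) might look like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_documentation",
    "arguments": {
      "query": "how to configure the embedding provider",
      "limit": 3
    }
  }
}
```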
Example Usage
In Claude Desktop or any other MCP-compatible client:
- Add documentation:
- Search documentation:
- List sources:
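The three operations above can be triggered with natural-language prompts; these are hypothetical examples (the URL is a placeholder):

```
Add the documentation at https://example.com/docs to the RAG database.
Search the documentation for "how to create a collection".
List all documentation sources.
```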
Development
- Clone the repository:
- Install dependencies:
- Build the project:
- Run locally:
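The steps above can be sketched as follows (the repository URL and npm script names are assumptions — check the project's package.json for the actual scripts):

```shell
# Clone the repository (URL is an assumption)
git clone https://github.com/qpd-v/mcp-ragdocs.git
cd mcp-ragdocs

# Install dependencies
npm install

# Build the project (script name is an assumption)
npm run build

# Run locally (script name is an assumption)
npm start
```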
License
MIT
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
GitHub Badge
Glama performs regular codebase and documentation scans to:
- Confirm that the MCP server is working as expected.
- Confirm that there are no obvious security issues with dependencies of the server.
- Extract server characteristics such as tools, resources, prompts, and required parameters.
Our directory badge helps users quickly assess whether the MCP server is safe, what capabilities it offers, and how to install it.
Copy the following code to your README.md file: