MCP-Ragdocs

Security: not tested · License: A (MIT, permissive) · Quality: not tested

A Model Context Protocol (MCP) server that enables semantic search and retrieval of documentation using a vector database (Qdrant). This server allows you to add documentation from URLs or local files and then search through them using natural language queries.

  1. Tools
  2. Prompts
  3. Resources
  4. Server Configuration
  5. README.md

Prompts

Interactive templates invoked by user choice

Name | Description

No prompts

Resources

Contextual data attached and managed by the client

Name | Description

No resources

Tools

Functions exposed to the LLM to take actions

Name | Description

No tools

Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default
OLLAMA_URL | No | URL of your Ollama instance. | http://localhost:11434
QDRANT_URL | Yes | URL of your Qdrant instance. For local use: http://localhost:6333. For Qdrant Cloud: https://your-cluster-url.qdrant.tech | –
OPENAI_API_KEY | No | Your OpenAI API key, required if using OpenAI as the embedding provider. | –
QDRANT_API_KEY | No | Your Qdrant Cloud API key, required if using Qdrant Cloud. | –
EMBEDDING_MODEL | No | Embedding model to use. Defaults to 'nomic-embed-text' for Ollama and 'text-embedding-3-small' for OpenAI. | –
EMBEDDING_PROVIDER | No | Choose between 'ollama' (default) or 'openai' for the embedding provider. | ollama
README.md

MCP-Ragdocs

A Model Context Protocol (MCP) server that enables semantic search and retrieval of documentation using a vector database (Qdrant). This server allows you to add documentation from URLs or local files and then search through them using natural language queries.

Version

Current version: 0.1.6

Features

  • Add documentation from URLs or local files
  • Store documentation in a vector database for semantic search
  • Search through documentation using natural language
  • List all documentation sources

Installation

Install globally using npm:

npm install -g @qpd-v/mcp-server-ragdocs

This will install the server in your global npm directory, which you'll need for the configuration steps below.
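
The configuration examples below point at files inside that directory. If you are unsure where it is on your machine, npm can tell you:

npm root -g                              # prints your global node_modules directory
npm ls -g @qpd-v/mcp-server-ragdocs      # confirms the package is installed there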

Requirements

  • Node.js 16 or higher
  • Qdrant (either local or cloud)
  • One of the following for embeddings:
    • Ollama running locally (default, free)
    • OpenAI API key (optional, paid)
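
If you go with the Ollama option, make sure the embedding model the server defaults to is actually available locally. A quick check, assuming a standard Ollama install on the default port:

ollama pull nomic-embed-text             # download the default embedding model
curl http://localhost:11434/api/tags     # list locally available models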

Qdrant Setup Options

Option 1: Local Qdrant

  1. Using Docker (recommended):
docker run -p 6333:6333 -p 6334:6334 qdrant/qdrant
  2. Or download from Qdrant's website
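
Either way, it is worth confirming that Qdrant is reachable before wiring it into the MCP server. A minimal check against the default local port:

curl http://localhost:6333/collections   # should return a small JSON document listing your collections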

Option 2: Qdrant Cloud

  1. Create an account at Qdrant Cloud
  2. Create a new cluster
  3. Get your cluster URL and API key from the dashboard
  4. Use these in your configuration (see Configuration section below)
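
As with a local instance, you can sanity-check connectivity before configuring the server. A minimal sketch using the cluster URL and API key from your dashboard (Qdrant Cloud expects the key in an api-key header):

curl -H "api-key: your-qdrant-api-key" https://your-cluster-url.qdrant.tech/collections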

Configuration

The server can be used with both Cline and Claude Desktop. Configuration differs slightly between them:

Cline Configuration

Add to your Cline settings file (%AppData%\Code\User\globalStorage\rooveterinaryinc.roo-cline\settings\cline_mcp_settings.json):

  1. Using npm global install (recommended):

{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["C:/Users/YOUR_USERNAME/AppData/Roaming/npm/node_modules/@qpd-v/mcp-server-ragdocs/build/index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}

For OpenAI instead of Ollama:

{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["C:/Users/YOUR_USERNAME/AppData/Roaming/npm/node_modules/@qpd-v/mcp-server-ragdocs/build/index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "openai",
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}

  2. Using local development setup:

{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["PATH_TO_PROJECT/mcp-ragdocs/build/index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}

Claude Desktop Configuration

Add to your Claude Desktop config file:

  • Windows: %AppData%\Claude\claude_desktop_config.json
  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  1. Windows Setup with Ollama (using full paths):

{
  "mcpServers": {
    "ragdocs": {
      "command": "C:\\Program Files\\nodejs\\node.exe",
      "args": ["C:\\Users\\YOUR_USERNAME\\AppData\\Roaming\\npm\\node_modules\\@qpd-v/mcp-server-ragdocs\\build\\index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}

Windows Setup with OpenAI:

{
  "mcpServers": {
    "ragdocs": {
      "command": "C:\\Program Files\\nodejs\\node.exe",
      "args": ["C:\\Users\\YOUR_USERNAME\\AppData\\Roaming\\npm\\node_modules\\@qpd-v/mcp-server-ragdocs\\build\\index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "openai",
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}

  2. macOS Setup with Ollama:

{
  "mcpServers": {
    "ragdocs": {
      "command": "/usr/local/bin/node",
      "args": ["/usr/local/lib/node_modules/@qpd-v/mcp-server-ragdocs/build/index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}

Qdrant Cloud Configuration

For either Cline or Claude Desktop, when using Qdrant Cloud, modify the env section:

With Ollama:

{ "env": { "QDRANT_URL": "https://your-cluster-url.qdrant.tech", "QDRANT_API_KEY": "your-qdrant-api-key", "EMBEDDING_PROVIDER": "ollama", "OLLAMA_URL": "http://localhost:11434" } }

With OpenAI:

{ "env": { "QDRANT_URL": "https://your-cluster-url.qdrant.tech", "QDRANT_API_KEY": "your-qdrant-api-key", "EMBEDDING_PROVIDER": "openai", "OPENAI_API_KEY": "your-openai-api-key" } }

Environment Variables

Qdrant Configuration

  • QDRANT_URL (required): URL of your Qdrant instance. For local use: http://localhost:6333; for Qdrant Cloud: https://your-cluster-url.qdrant.tech
  • QDRANT_API_KEY (required if using Qdrant Cloud): Your Qdrant Cloud API key

Embeddings Configuration

  • EMBEDDING_PROVIDER (optional): Choose between 'ollama' (default) or 'openai'
  • EMBEDDING_MODEL (optional):
    • For Ollama: defaults to 'nomic-embed-text'
    • For OpenAI: defaults to 'text-embedding-3-small'
  • OLLAMA_URL (optional): URL of your Ollama instance (defaults to http://localhost:11434)
  • OPENAI_API_KEY (required if using OpenAI): Your OpenAI API key
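
These variables are normally supplied through the env block of your MCP client configuration (see the Configuration section above), but for a quick standalone smoke test you can export them and launch the built server directly. A sketch for macOS/Linux, assuming a global npm install and the Ollama defaults:

export QDRANT_URL=http://localhost:6333
export EMBEDDING_PROVIDER=ollama
export OLLAMA_URL=http://localhost:11434
node "$(npm root -g)/@qpd-v/mcp-server-ragdocs/build/index.js"

The server speaks MCP over stdio, so when started this way it will simply sit and wait for a client to send requests.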

Available Tools

  1. add_documentation
    • Add documentation from a URL to the RAG database
    • Parameters:
      • url: URL of the documentation to fetch
  2. search_documentation
    • Search through stored documentation
    • Parameters:
      • query: Search query
      • limit (optional): Maximum number of results to return (default: 5)
  3. list_sources
    • List all documentation sources currently stored
    • No parameters required
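
For reference, an MCP client invokes these tools with a JSON-RPC tools/call request over the server's stdio transport. A sketch of what a search_documentation call might look like on the wire (the envelope is defined by the MCP specification; the tool name and parameters are the ones listed above):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_documentation",
    "arguments": { "query": "authentication", "limit": 5 }
  }
}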

Example Usage

In Claude Desktop or any other MCP-compatible client:

  1. Add documentation:
Add this documentation: https://docs.example.com/api
  2. Search documentation:
Search the documentation for information about authentication
  3. List sources:
What documentation sources are available?

Development

  1. Clone the repository:
git clone https://github.com/qpd-v/mcp-server-ragdocs.git
cd mcp-server-ragdocs
  2. Install dependencies:
npm install
  3. Build the project:
npm run build
  4. Run locally:
npm start
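
To poke at the server interactively while developing, you can launch it under the MCP Inspector (a separate tool, not part of this project), which lists the exposed tools and lets you call them from a browser UI. A sketch, assuming the project has been built and Qdrant and Ollama are running:

QDRANT_URL=http://127.0.0.1:6333 EMBEDDING_PROVIDER=ollama OLLAMA_URL=http://localhost:11434 \
  npx @modelcontextprotocol/inspector node build/index.js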

License

MIT

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

GitHub Badge

Glama performs regular codebase and documentation scans to:

  • Confirm that the MCP server is working as expected.
  • Confirm that there are no obvious security issues with dependencies of the server.
  • Extract server characteristics such as tools, resources, prompts, and required parameters.

Our directory badge helps users quickly assess whether the MCP server is safe, what capabilities it provides, and how to install it.

Copy the following code to your README.md file:
