
RAG MCP Server

by NSANTRA




TL;DR:

  • This project implements a Retrieval-Augmented Generation (RAG) MCP Server using LangChain wrappers for ChromaDB and Hugging Face models.

  • Designed for seamless integration with Claude Desktop and Cursor IDE as the MCP client.

  • Uses a single persistent Chroma vector database with multiple collections (domains).

  • Automatically retrieves and ranks the most relevant context for Claude, enabling domain-aware reasoning and citation-based responses.


TOC

  • Project Overview

    • Workflow

  • Features

  • Getting Started

    • Prerequisites

    • Installation

  • Integrations

    • Claude Desktop Integration

    • Cursor IDE Integration

  • MCP Inspector

  • Available Tools

  • Project Structure

  • References

  • License


PROJECT OVERVIEW

This project implements a LangChain-powered Retrieval-Augmented Generation (RAG) pipeline hosted as a FastMCP server for integration with Claude Desktop and Cursor IDE.

It uses:

  • langchain_chroma.Chroma for persistent, domain-based vector stores.

  • langchain_huggingface.HuggingFaceEmbeddings for local or HuggingFace embedding models.

  • langchain_community.cross_encoders.HuggingFaceCrossEncoder for local or HuggingFace cross-encoder reranking models that improve retrieval relevance.

  • FastMCP — a lightweight Python interface (built on FastAPI) that exposes LangChain-based retrieval tools to any MCP client such as Claude Desktop or Cursor IDE.

Each Chroma collection represents a distinct knowledge domain or document. Claude queries are routed to the appropriate collection, which retrieves top-k results and returns relevant context and citations.
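The routing idea can be illustrated with a minimal, dependency-free sketch. Toy two-dimensional vectors stand in for real embeddings, and all names here are illustrative, not the project's actual API:

```python
from math import sqrt

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Each "collection" maps a chunk of text to a toy embedding vector.
collections = {
    "networking": {"TCP handshake": [1.0, 0.1], "DNS resolution": [0.9, 0.2]},
    "databases":  {"B-tree index": [0.1, 1.0], "WAL logging": [0.2, 0.9]},
}

def retrieve(query_vec, collection_name, k=1):
    # Rank chunks in one collection by similarity and return the top-k.
    store = collections[collection_name]
    ranked = sorted(store.items(), key=lambda kv: cosine(query_vec, kv[1]), reverse=True)
    return [text for text, _ in ranked[:k]]
```

In the actual server, Chroma computes these similarities over HuggingFace embeddings; the principle of ranking chunks within a chosen collection and returning the top-k is the same.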

⚡Workflow:

```mermaid
flowchart TD
    Claude[Claude Desktop]
    MCP[MCP Server: FastMCP + LangChain]
    LangChain[LangChain Wrappers → ChromaDB + HuggingFace]
    Claude --> MCP --> LangChain --> Claude
```

FEATURES

  • PDF Embedding: Add PDFs locally or via URL directly into a chosen collection.

  • Smart Retrieval: Retrieve context chunks per collection or across multiple collections.

  • Reranking Support: Uses a HuggingFace cross-encoder reranker for better document relevance.

  • Document Management: List, rename, and inspect metadata for locally stored documents.

  • Collection Management: Create, list, and delete ChromaDB collections dynamically.

  • Citation Provider: Citations are generated from document metadata (e.g., page numbers, source document name and path).

  • Self-Describing Tools: describeTools() lists all available MCP tools dynamically for introspection.
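As a rough sketch of the citation feature, the snippet below assembles a citation string from a retrieved chunk's metadata. The keys used here ("source", "page") mirror what PDF loaders commonly attach; the server's actual metadata schema may differ:

```python
def format_citation(metadata):
    # Build a human-readable citation from a retrieved chunk's metadata.
    # "source" and "page" are assumed keys, not the project's guaranteed schema.
    source = metadata.get("source", "unknown source")
    page = metadata.get("page")
    return f"{source}, p. {page}" if page is not None else source

print(format_citation({"source": "paper.pdf", "page": 3}))  # paper.pdf, p. 3
```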


AVAILABLE TOOLS

This MCP server exposes a set of tools that an MCP client can invoke to perform document and collection operations, including embedding, retrieval, metadata management, and citation generation.

For a full list of available tools, their arguments, and example usage, see the dedicated documentation:
View All Tools → TOOLS.md
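The self-describing pattern behind describeTools() can be sketched in plain Python. This is illustrative only; the server's real tool manifests live in ToolDefinition.py, and the function bodies below are stubs:

```python
TOOLS = {}

def tool(description):
    # Decorator that registers a function as an MCP-style tool.
    def register(fn):
        TOOLS[fn.__name__] = {"description": description, "handler": fn}
        return fn
    return register

@tool("Embed a PDF into a named collection.")
def embed_pdf(path, collection):
    ...

@tool("Retrieve top-k chunks for a query.")
def retrieve(query, collection, k=5):
    ...

def describe_tools():
    # Return name and description for every registered tool,
    # which is what a client needs for introspection.
    return {name: meta["description"] for name, meta in TOOLS.items()}
```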


GETTING STARTED

🔧 Prerequisites

IMPORTANT
  • Ensure Anaconda is installed; if not, you can download it from Anaconda. Git is also required (if not available, download it from GitHub).

⚙️ Installation

  1. Create and Activate the Conda Environment

```shell
conda create -n MCP python=3.11.13 -y
conda activate MCP
```

  2. Clone the Repository

```shell
git clone https://github.com/NSANTRA/RAG-MCP-Server.git
cd RAG-MCP-Server
```

  3. Install Dependencies

```shell
pip install -r requirements.txt
```

  4. Configure .env

```
# Example:
# If your system has an Nvidia GPU with the CUDA Toolkit set up, set the device to cuda; otherwise set it to cpu.
DEVICE = "cuda"
DOCUMENT_DIR = "C:/Users/<yourusername>/Projects/RAG-MCP-Server/Documents"
CHROMA_DB_PERSIST_DIR = "C:/Users/<yourusername>/Projects/RAG-MCP-Server/Databases"
EMBEDDING_MODEL = "C:/Users/<yourusername>/Projects/RAG-MCP-Server/Models/MiniLM"
RERANKER_MODEL = "C:/Users/<yourusername>/Projects/RAG-MCP-Server/Models/MiniLM-Reranker"
```
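To show what consuming this file involves, here is a minimal stdlib-only .env loader. The project may well use a library such as python-dotenv instead; this sketch just illustrates the KEY = "value" format shown above:

```python
import os

def load_env(path=".env"):
    # Minimal .env loader: reads KEY = "value" lines into os.environ.
    # '#' starts a comment; blank lines and lines without '=' are skipped.
    # (Illustrative only; the project's actual loading mechanism may differ.)
    with open(path) as fh:
        for line in fh:
            line = line.split("#", 1)[0].strip()
            if not line or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip().strip('"')
```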
CAUTION

Use absolute paths wherever a path is required.

TIP
  • The configuration above uses locally downloaded models. You can download them using the Download Model.py Python script, and change the models if needed.

  • You can swap the embedding or reranker paths for any HuggingFace models.


INTEGRATIONS

IMPORTANT

You need to install the Claude Desktop app or Cursor IDE before running the RAG MCP Server, since the server requires an MCP client.

These MCP clients automatically launch the RAG MCP Server once it is registered in their MCP configuration file.
You do not need to run the Python script manually.

Claude Desktop Integration

🛠️ Setup Instructions

  • Add the following entry to your Claude MCP configuration file (typically located in your Claude Desktop settings folder).

  • You can find the mcp configuration file here: Settings → Developer → Edit Config to open the file.

  • Then, add the following JSON config:

```json
{
  "mcpServers": {
    "RAG": {
      "command": "C:/Users/<yourusername>/anaconda3/envs/MCP/python.exe",
      "args": ["<absolute path to Main.py>"],
      "options": {
        "cwd": "<absolute path to the project root directory>"
      }
    }
  }
}
```

⚠️ Common Issue: If Claude fails to start the MCP server, ensure that:

  • The Python path points to your Conda environment’s executable.

  • Main.py has no syntax errors and dependencies are installed.

  • The cwd option matches your project root directory.

Cursor IDE Integration

🛠️ Setup Instructions

  • Open your project in Cursor IDE and go to File → Preferences → Cursor Settings → Tools & MCP → New MCP Server to open your MCP configuration file.

  • Add the following JSON entry under the "mcpServers" section (adjusting paths as needed):

```json
{
  "mcpServers": {
    "RAG": {
      "command": "C:/Users/<yourusername>/anaconda3/envs/MCP/python.exe",
      "args": ["<absolute path to Main.py>"],
      "options": {
        "cwd": "<absolute path to the project root directory>"
      }
    }
  }
}
```

MCP INSPECTOR

TIP

MCP Inspector is an official developer tool from Anthropic that lets you test, debug, and inspect any Model Context Protocol (MCP) server — including custom RAG MCP servers — without requiring Claude Desktop or Cursor IDE.

IMPORTANT
  • To use MCP Inspector, you must have Node.js installed.

  • During installation, enable “Add to PATH.”

  • Verify your installation with node -v, npm -v, and npx -v.

What It Does

  • Lets you call tools interactively and see raw JSON input/output.

  • Displays system logs, server metadata, and protocol messages.

  • Ideal for testing new tool definitions or debugging retrieval workflows.

Installation

You can install MCP Inspector globally using npm:

```shell
npm install -g @modelcontextprotocol/inspector
```

Or run directly with npx (no install needed):

```shell
npx @modelcontextprotocol/inspector
```

Usage

  1. Navigate to your project root directory where Main.py is located.

  2. Launch your MCP server via the Inspector:

```shell
npx @modelcontextprotocol/inspector python Main.py
```
TIP

(If you are using a Conda environment, replace python with the full path to that environment's Python executable, or activate the environment first and run the command as is.)

  3. The Inspector will open a local web interface (usually at http://localhost:6274) showing:

    • Input/output schemas

    • Real-time logs and response traces


PROJECT STRUCTURE

```
├── Main.py                # Entry point - starts the FastMCP server
│
├── Modules/
│   ├── Config.py          # Loads env vars, sets up embeddings & reranker
│   ├── Core.py            # Document-level utilities (metadata, citation, rename)
│   ├── Database.py        # ChromaDB logic for embedding/retrieval
│   ├── Utils.py           # Helper functions (file ops, reranking)
│   └── ToolDefinition.py  # MCP tool manifests and argument schemas
│
├── .env                   # Environment configuration
├── requirements.txt       # Dependencies
└── README.md
```

REFERENCES

  1. LangChain RAG Workflow: LangChain Documentation - RAG

  2. Chroma Vector Database: Chroma Docs

  3. HuggingFace Embeddings and Cross-Encoders: Sentence Transformers Cross-Encoder Models

  4. Anthropic MCP & Claude Desktop: Model Context Protocol Official Site; Claude Desktop Overview


LICENSE

MIT License

Copyright (c) 2025 Neelotpal Santra

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
