TL;DR:
This project implements a Retrieval-Augmented Generation (RAG) MCP Server using LangChain wrappers for ChromaDB and Hugging Face models.
Designed for seamless integration with Claude Desktop and Cursor IDE as the MCP client.
Uses a single persistent Chroma vector database with multiple collections (domains).
Automatically retrieves and ranks the most relevant context for Claude, enabling domain-aware reasoning and citation-based responses.
Project Overview
Workflow
Features
Getting Started
Prerequisites
Installation
Integrations
Claude Desktop Integration
Cursor IDE Integration
MCP Inspector
Available Tools
Project Structure
References
License
This project implements a LangChain-powered Retrieval-Augmented Generation (RAG) pipeline hosted as a FastMCP server for integration with Claude Desktop and Cursor IDE.
It uses:
langchain_chroma.Chroma for persistent, domain-based vector stores.
langchain_huggingface.HuggingFaceEmbeddings for local or HuggingFace embedding models.
langchain_community.cross_encoders.HuggingFaceCrossEncoder for local or HuggingFace reranking models for better relevance scoring.
FastMCP: a lightweight, FastAPI-style Python interface that exposes LangChain-based retrieval tools to any MCP client such as Claude Desktop or Cursor IDE.
Each Chroma collection represents a distinct knowledge domain or document. Claude queries are routed to the appropriate collection, which retrieves top-k results and returns relevant context and citations.
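As a rough sketch of how these pieces fit together (the model names, collection name, and persist directory below are illustrative assumptions, not the project's actual configuration):

```python
# Illustrative only: model names, collection name, and paths are assumptions.
from langchain_chroma import Chroma
from langchain_huggingface import HuggingFaceEmbeddings
from langchain_community.cross_encoders import HuggingFaceCrossEncoder

# One persistent Chroma database; each collection is a knowledge domain.
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
store = Chroma(
    collection_name="physics_papers",        # hypothetical domain
    embedding_function=embeddings,
    persist_directory="./chroma_db",         # hypothetical path
)

# Retrieve top-k candidates, then rerank them with a cross-encoder.
query = "What is retrieval-augmented generation?"
candidates = store.similarity_search(query, k=10)

reranker = HuggingFaceCrossEncoder(model_name="cross-encoder/ms-marco-MiniLM-L-6-v2")
scores = reranker.score([(query, doc.page_content) for doc in candidates])
reranked = [doc for _, doc in sorted(zip(scores, candidates),
                                     key=lambda pair: pair[0], reverse=True)]
context = reranked[:4]   # top chunks returned to the MCP client with citations
```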
⚡ Workflow:
PDF Embedding: Add PDFs locally or via URL directly into a chosen collection.
Smart Retrieval: Retrieve context chunks per collection or across multiple collections.
Reranking Support: Uses a HuggingFace cross-encoder reranker for better document relevance.
Document Management: List, rename, and inspect metadata for locally stored documents.
Collection Management: Create, list, and delete ChromaDB collections dynamically.
Citation Provider: Citations are generated from document metadata (e.g., page numbers, source document, and file path).
Self-Describing Tools:
describeTools() lists all available MCP tools dynamically for introspection.
This MCP server exposes a set of tools that can be invoked by an MCP client to perform document and collection operations, including embedding, retrieval, metadata management, and citation generation.
For a full list of available tools, their arguments, and example usage, see the dedicated documentation:
View All Tools → TOOLS.md
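As a rough illustration of how such tools are exposed, the sketch below registers a single retrieval tool on a FastMCP server. The tool name, signature, import path, model name, and directory are assumptions for illustration only; the project's actual tools and arguments are documented in TOOLS.md.

```python
# Hypothetical sketch; not the project's actual Main.py.
from fastmcp import FastMCP
from langchain_chroma import Chroma
from langchain_huggingface import HuggingFaceEmbeddings

mcp = FastMCP("rag-mcp-server")
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")

@mcp.tool()
def retrieve_context(query: str, collection: str, k: int = 5) -> list[str]:
    """Return the top-k context chunks from the given Chroma collection."""
    store = Chroma(collection_name=collection,
                   embedding_function=embeddings,
                   persist_directory="./chroma_db")   # hypothetical path
    return [doc.page_content for doc in store.similarity_search(query, k=k)]

if __name__ == "__main__":
    mcp.run()   # stdio transport, as expected by Claude Desktop and Cursor
```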
🔧 Prerequisites
⚙️ Installation
Create and Activate Conda Environment
Clone the Repository
Install Dependencies
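The exact commands depend on your setup, but a typical sequence looks like this (the environment name, Python version, repository URL, and requirements.txt file are placeholders or assumptions):

```bash
# Create and activate a Conda environment (name and Python version are illustrative)
conda create -n rag-mcp python=3.11
conda activate rag-mcp

# Clone the repository (substitute the actual URL and folder name)
git clone <repository-url>
cd <repository-folder>

# Install dependencies (assuming a requirements.txt is provided)
pip install -r requirements.txt
```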
Configure .env
Use absolute paths wherever a path is required.
The configuration above uses locally downloaded models. You can download them with the Download Model.py script, and change the models if needed.
You can swap the embedding or reranker paths for any HuggingFace models.
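A hypothetical .env might look like the following; the variable names are purely illustrative, so match them to whatever Main.py actually reads:

```
# Illustrative variable names only; use absolute paths.
EMBEDDING_MODEL_PATH=/home/user/models/all-MiniLM-L6-v2
RERANKER_MODEL_PATH=/home/user/models/ms-marco-MiniLM-L-6-v2
CHROMA_PERSIST_DIR=/home/user/rag-mcp-server/chroma_db
```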
You need to download the Claude Desktop app or Cursor IDE in order to run the MCP server, as it requires an MCP client.
Both MCP clients launch the RAG MCP Server automatically once it is registered in their MCP configuration file.
You do not need to run the Python script manually.
Claude Desktop Integration
🛠️ Setup Instructions
Add the following entry to your Claude MCP configuration file (typically located in your Claude Desktop settings folder).
You can open the MCP configuration file via Settings → Developer → Edit Config.
Then, add the following JSON config:
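A minimal example is sketched below; the server name and every path are placeholders, so point command at your Conda environment's Python executable and cwd at the project root:

```json
{
  "mcpServers": {
    "rag-mcp-server": {
      "command": "/path/to/conda/envs/rag-mcp/bin/python",
      "args": ["Main.py"],
      "cwd": "/path/to/rag-mcp-server"
    }
  }
}
```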
⚠️ Common Issue: If Claude fails to start the MCP server, ensure that:
The Python path points to your Conda environment’s executable.
Main.py has no syntax errors and all dependencies are installed.
The cwd option matches your project root directory.
Cursor IDE Integration
🛠️ Setup Instructions
Open your project in Cursor IDE and go to File → Preferences → Cursor Settings → Tools & MCP → New MCP Server to open your MCP configuration file.
Add the following JSON entry under the "mcpServers" section (adjusting paths as needed):
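For example (the server name and paths are placeholders; the same entry shown in the Claude Desktop section works here):

```json
{
  "mcpServers": {
    "rag-mcp-server": {
      "command": "/path/to/conda/envs/rag-mcp/bin/python",
      "args": ["Main.py"],
      "cwd": "/path/to/rag-mcp-server"
    }
  }
}
```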
MCP Inspector is an official developer tool from Anthropic that lets you test, debug, and inspect any Model Context Protocol (MCP) server — including custom RAG MCP servers — without requiring Claude Desktop or Cursor IDE.
To use MCP Inspector, you must have Node.js installed.
During installation, enable “Add to PATH.”
Verify your installation with node -v, npm -v, and npx -v.
What It Does
Lets you call tools interactively and see raw JSON input/output.
Displays system logs, server metadata, and protocol messages.
Ideal for testing new tool definitions or debugging retrieval workflows.
Installation
You can install MCP Inspector globally using npm:
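For example, assuming the standard @modelcontextprotocol/inspector package name:

```bash
npm install -g @modelcontextprotocol/inspector
```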
Or run directly with npx (no install needed):
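```bash
# Again assuming the standard package name; npx fetches and runs it on demand.
npx @modelcontextprotocol/inspector
```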
Usage
Navigate to your project root directory where Main.py is located.
Launch your MCP server via the Inspector:
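For instance (this invocation pattern is an assumption; adjust the Python command to your environment):

```bash
npx @modelcontextprotocol/inspector python Main.py
```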
(If using a Conda environment, replace python with its full path. Or, first activate the environment, and use the above command as it is.)
The Inspector will open a local web interface (usually at http://localhost:6274) showing:
Input/output schemas
Real-time logs and response traces
LangChain RAG Workflow: LangChain Documentation (RAG)
Chroma Vector Database: Chroma Docs
HuggingFace Embeddings and Cross-Encoders: Sentence Transformers Cross-Encoder Models
Anthropic MCP & Claude Desktop: Model Context Protocol Official Site, Claude Desktop Overview
MIT License
Copyright (c) 2025 Neelotpal Santra
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.