# MCP RAG Server

A Model Context Protocol (MCP) server that provides RAG (Retrieval-Augmented Generation) functionality using local embeddings via Ollama and the Chroma vector database.
## Features

- **Local Processing**: No external API costs - runs entirely locally
- **Multiple Formats**: Supports PDF, Markdown, and TXT files
- **Smart Chunking**: Configurable chunk size with overlap for better context
- **Vector Search**: Semantic search using the nomic-embed-text model via Ollama
- **MCP Integration**: Works seamlessly with Cursor and other MCP clients
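The smart-chunking behavior (fixed-size chunks that overlap so context survives chunk boundaries) can be sketched as follows. This is an illustrative `chunkText` helper using the defaults from `config.json` (`chunkSize: 1000`, `chunkOverlap: 200`); the server's actual splitter may differ, e.g. by respecting sentence boundaries.

```typescript
// Sketch of overlapping fixed-size chunking (hypothetical helper, not the
// server's real implementation). Defaults mirror config.json.
function chunkText(text: string, chunkSize = 1000, overlap = 200): string[] {
  if (overlap >= chunkSize) {
    throw new Error("chunkOverlap must be smaller than chunkSize");
  }
  const chunks: string[] = [];
  const step = chunkSize - overlap; // each chunk starts 800 chars after the last
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last chunk reached the end
  }
  return chunks;
}

// A 2,200-character document yields three chunks; adjacent chunks share
// their last/first 200 characters.
const doc = "x".repeat(2200);
const chunks = chunkText(doc);
console.log(chunks.length, chunks[0].length); // prints "3 1000"
```

The overlap means a sentence cut off at the end of one chunk reappears at the start of the next, so a semantic search hit near a boundary still carries usable context.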
## Prerequisites

- Node.js (v18 or higher)
- Docker (for ChromaDB)
- Homebrew (for Ollama on macOS)
## 🚀 Quick Start

### Setup (one time)

```bash
npm run setup
```

This will:

1. Start Ollama and install the nomic-embed-text model
2. Start ChromaDB with Docker
3. Build the project
4. Ingest documents from `./docs`
### Development

```bash
# Start MCP server
npm run dev

# Ingest new documents
npm run ingest
```

### Stop Services

```bash
npm run stop
```

## Configuration

The server uses a `config.json` file for configuration:
```json
{
  "documentsPath": "./docs",
  "chunkSize": 1000,
  "chunkOverlap": 200,
  "ollamaUrl": "http://localhost:11434",
  "embeddingModel": "nomic-embed-text",
  "chromaUrl": "http://localhost:8001",
  "collectionName": "rag_documents",
  "mcpServer": {
    "name": "mcp-rag-server",
    "version": "1.0.0"
  }
}
```

## MCP Tools
- `ingest_docs({path?})` - Ingest documents from a directory
- `search({query, k?})` - Search for relevant document chunks
- `get_chunk({id})` - Retrieve a specific chunk by ID
- `refresh_index()` - Clear and refresh the entire index
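MCP clients invoke these tools over JSON-RPC 2.0 using the protocol's standard `tools/call` method. As an illustration, a `search` request might look like the following (the argument values are examples; the exact argument schema is defined by this server's tool declarations):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": {
      "query": "API authentication flow",
      "k": 5
    }
  }
}
```

In practice a client like Cursor constructs this call for you; the payload is shown only to clarify what happens under the hood.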
## MCP Resources

- `rag://collection/summary` - Collection statistics and metadata
- `rag://doc/<filename>#<chunk_id>` - Individual document chunks
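Resources are fetched with the MCP `resources/read` method, passing one of the URIs above. For example, a client could request the collection summary like this (illustrative payload):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "resources/read",
  "params": {
    "uri": "rag://collection/summary"
  }
}
```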
## Configure in Cursor

Add to your Cursor MCP settings:
```json
{
  "mcpServers": {
    "rag-server": {
      "command": "node",
      "args": ["/Users/luizsoares/Documents/buildaz/mcp_rag/dist/index.js"],
      "env": {}
    }
  }
}
```

## Available Scripts

- `npm run setup` - Complete setup (Ollama + ChromaDB + build + ingest)
- `npm run dev` - Start MCP server in development mode
- `npm run ingest` - Ingest documents
- `npm run build` - Build the project
- `npm run test` - Run tests
- `npm run stop` - Stop all services
## Troubleshooting

- **Ollama Connection Issues**: Ensure Ollama is running on the configured URL
- **Model Not Found**: Run `ollama pull nomic-embed-text` to install the embedding model
- **Docker Issues**: Ensure Docker is running and accessible
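A quick way to check both backing services is to probe their HTTP endpoints, assuming the default URLs from `config.json`. (The ChromaDB heartbeat path below is the v1 API; newer Chroma releases may expose `/api/v2/heartbeat` instead.)

```shell
# Probe Ollama's API (default port 11434); -sf makes curl fail quietly on error.
status_ollama=$(curl -sf http://localhost:11434/api/tags > /dev/null \
  && echo "up" || echo "down")
echo "ollama: $status_ollama"

# Probe ChromaDB's heartbeat (port 8001 per config.json).
status_chroma=$(curl -sf http://localhost:8001/api/v1/heartbeat > /dev/null \
  && echo "up" || echo "down")
echo "chroma: $status_chroma"
```

If either prints `down`, restart the service (`npm run setup` brings both up) before retrying ingestion or search.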