# MCP-RAGNAR - a local RAG MCP Server
A local MCP server that implements RAG (Retrieval-Augmented Generation) with sentence window retrieval.
## Features
- Document indexing with support for multiple file types (txt, md, pdf, doc, docx)
- Sentence window retrieval for better context understanding
- Configurable embedding models (OpenAI endpoint or a local Hugging Face model, e.g. BAAI/bge-large-en-v1.5)
- MCP server integration for easy querying
## Requirements
- Python 3.10+
- uv package manager
## Installation
1. Clone the repository:
```bash
git clone <repository-url>
cd mcp-ragnar
```
2. Install dependencies using uv:
```bash
uv pip install -e .
```
## Usage
### Indexing Documents
You can index documents either from the command line or programmatically (a programmatic sketch follows the command-line examples below).
#### Command line
```bash
# Index a directory of documents with the default local embedding model
python -m indexer.index /path/to/documents /path/to/index
# Change the default local embedding model and chunk size
python -m indexer.index /path/to/documents /path/to/index --chunk-size=512 --embed-model BAAI/bge-small-en-v1.5
# Use an OpenAI embedding endpoint (put your OPENAI_API_KEY in the environment)
python -m indexer.index /path/to/documents /path/to/index --embed-endpoint https://api.openai.com/v1 --embed-model text-embedding-3-small --tokenizer-model o200k_base
# Get help
python -m indexer.index --help
```
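#### Programmatically
The indexer's internal Python API is not documented in this README, so the sketch below simply drives the same `indexer.index` CLI from Python via `subprocess`. The `build_index` helper name is hypothetical and used only for illustration.
```python
import subprocess
import sys


def build_index(documents_dir: str, index_dir: str, embed_model: str | None = None) -> None:
    """Hypothetical helper: index a directory by invoking the documented indexer.index CLI."""
    cmd = [sys.executable, "-m", "indexer.index", documents_dir, index_dir]
    if embed_model is not None:
        cmd += ["--embed-model", embed_model]
    subprocess.run(cmd, check=True)


if __name__ == "__main__":
    # Example: index ./docs into /tmp/index with the default local embedding model.
    build_index("./docs", "/tmp/index")
```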
### Running the MCP Server
#### Configuration
Configuration can be supplied as environment variables or via a `.env` file.
- `EMBED_ENDPOINT`: (Optional) URL of an OpenAI-compatible embedding endpoint (ending with `/v1`). If not set, a local Hugging Face model is used by default.
- `EMBED_MODEL`: (Optional) Name of the embedding model to use. Defaults to `BAAI/bge-large-en-v1.5`.
- `INDEX_ROOT`: The root directory of the index, used by the retriever. Mandatory for MCP querying.
- `MCP_DESCRIPTION`: The name and description exposed by the MCP server, used for MCP querying only. Mandatory for MCP querying. For example: "RAG to my local personal documents"
#### SSE mode
In SSE mode the server listens on http://localhost:8001/ragnar:
```shell
python server/sse.py
```
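Once the server is running, any MCP client can connect over SSE. Below is a minimal sketch using the official `mcp` Python SDK (this client-side snippet is an assumption for illustration, not part of mcp-ragnar itself); the exact tool names exposed by the server are not documented here, so it only lists them.
```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    # Connect to the SSE endpoint started by `python server/sse.py`.
    async with sse_client("http://localhost:8001/ragnar") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tools the server exposes (e.g. the RAG query tool).
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```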
#### stdio mode
Install it locally as a uv tool:
```shell
uv tool install .
```
#### Claude Desktop
Update the following configuration file:
On macOS: `~/Library/Application\ Support/Claude/claude_desktop_config.json`
On Windows: `%APPDATA%/Claude/claude_desktop_config.json`
Example:
```json
{
  "mcpServers": {
    "mcp-ragnar": {
      "command": "uvx",
      "args": [
        "mcp-ragnar"
      ],
      "env": {
        "OPENAI_API_KEY": "",
        "EMBED_ENDPOINT": "https://api.openai.com/v1",
        "EMBED_MODEL": "text-embedding-3-small",
        "MCP_DESCRIPTION": "My local Rust documentation",
        "INDEX_ROOT": "/tmp/index"
      }
    }
  }
}
```
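To try the stdio server outside of Claude Desktop, the same command, arguments, and environment from the JSON above can be used with the official `mcp` Python SDK. This client-side sketch is an assumption for illustration; adjust the environment values to your own setup.
```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Mirror the Claude Desktop entry above; inherit the current environment and
# override the mcp-ragnar settings.
server = StdioServerParameters(
    command="uvx",
    args=["mcp-ragnar"],
    env={
        **os.environ,
        "MCP_DESCRIPTION": "My local Rust documentation",
        "INDEX_ROOT": "/tmp/index",
    },
)


async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            print(await session.list_tools())


asyncio.run(main())
```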
## License
GNU General Public License v3.0