MCP-RAGNAR - a local RAG MCP Server

A local MCP server that implements RAG (Retrieval-Augmented Generation) with sentence window retrieval.

Features

  • Document indexing with support for multiple file types (txt, md, pdf, doc, docx)

  • Sentence window retrieval for better context understanding

  • Configurable embedding models (OpenAI or a local Hugging Face model, e.g. BAAI/bge-large-en-v1.5)

  • MCP server integration for easy querying
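Sentence window retrieval returns not just the single best-matching sentence but also its neighboring sentences, so the language model sees coherent surrounding context. A toy sketch of the idea follows; word-overlap scoring stands in for the real embedding similarity, which is a simplification for illustration only:

```python
import re

def sentence_window_retrieve(text: str, query: str, window: int = 1) -> str:
    """Toy sentence window retrieval: find the sentence that best matches
    the query (here by word overlap; the real server uses embeddings),
    then return it together with `window` sentences on each side."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    q_words = set(query.lower().split())
    scores = [len(q_words & set(s.lower().split())) for s in sentences]
    best = scores.index(max(scores))
    lo = max(0, best - window)
    hi = min(len(sentences), best + window + 1)
    return " ".join(sentences[lo:hi])
```

With window=1, a hit on the middle sentence of a document also returns the sentences immediately before and after it.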

Requirements

  • Python 3.10+

  • UV package manager

Installation

  1. Clone the repository:

git clone <repository-url>
cd mcp-ragnar

  2. Install dependencies using UV:

uv pip install -e .

Usage

Indexing Documents

You can index documents either programmatically or via the command line.

Indexing

python -m indexer.index /path/to/documents /path/to/index

# To change the default local embedding model and chunk size
python -m indexer.index /path/to/documents /path/to/index --chunk-size=512 --embed-model BAAI/bge-small-en-v1.5

# With OpenAI embedding endpoint (put your OPENAI_API_KEY in env)
python -m indexer.index /path/to/documents /path/to/index --embed-endpoint https://api.openai.com/v1 --embed-model text-embedding-3-small --tokenizer-model o200k_base

# Get help
python -m indexer.index --help
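The --chunk-size flag controls how documents are split before embedding. Conceptually it works like the sketch below; note that this is a simplified, word-based stand-in, whereas the actual splitter operates on tokenizer tokens (hence the --tokenizer-model flag):

```python
def chunk_words(text: str, chunk_size: int) -> list[str]:
    """Split text into fixed-size chunks. Words are used here only to keep
    the sketch dependency-free; the real indexer counts tokenizer tokens."""
    words = text.split()
    return [" ".join(words[i:i + chunk_size])
            for i in range(0, len(words), chunk_size)]
```

Smaller chunks give more precise retrieval hits; larger chunks preserve more context per hit.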

Running the MCP Server

Configuration

Configuration can be supplied as environment variables or via a .env file.

  • EMBED_ENDPOINT: (Optional) Path to an OpenAI compatible embedding endpoint (ends with /v1). If not set, a local Hugging Face model is used by default.

  • EMBED_MODEL: (Optional) Name of the embedding model to use. Default value of BAAI/bge-large-en-v1.5.

  • INDEX_ROOT: The root directory of the index, used by the retriever. This is mandatory for MCP (Model Context Protocol) querying.

  • MCP_DESCRIPTION: The exposed name and description for the MCP server, used for MCP querying only. This is mandatory for MCP querying. For example: "RAG to my local personal documents"

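For example, a minimal .env for local embeddings (using the values shown elsewhere in this README; adjust the paths and description to your setup) might look like:

```
EMBED_MODEL=BAAI/bge-large-en-v1.5
INDEX_ROOT=/tmp/index
MCP_DESCRIPTION=RAG to my local personal documents
```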
In SSE mode, the server listens on http://localhost:8001/ragnar:

python server/sse.py

In stdio mode

Install it locally as a uv tool:

uv tool install .

Claude Desktop:

Update the configuration file for your platform:

On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json

On Windows: %APPDATA%/Claude/claude_desktop_config.json

Example:

{
  "mcpServers": {
    "mcp-ragnar": {
      "command": "uvx",
      "args": [
        "mcp-ragnar"
      ],
      "env": {
        "OPENAI_API_KEY": "",
        "EMBED_ENDPOINT": "https://api.openai.com/v1",
        "EMBED_MODEL": "text-embedding-3-small",
        "MCP_DESCRIPTION": "My local Rust documentation",
        "INDEX_ROOT": "/tmp/index"
      }
    }
  }
}

License

GNU General Public License v3.0
