# RAG-MCP Server

A general-purpose Retrieval-Augmented Generation (RAG) server using the Model Context Protocol (MCP), designed to be tested with RISC Zero's Bonsai documentation.
## Overview
This project implements a RAG server that:
- Uses MCP (Model Context Protocol) for standardized communication
- Implements RAG (Retrieval-Augmented Generation) workflow for document querying
- Can be tested with RISC Zero's Bonsai documentation
- Supports local LLM integration through Ollama
## Features
- Document ingestion and indexing
- Semantic search capabilities
- Local LLM integration
- MCP protocol compliance
- RISC Zero Bonsai documentation support
## Prerequisites
- Python 3.12+
- Ollama (for local LLM support)
- Poetry (for dependency management)
## Installation
- Install Python dependencies:
- Install and start Ollama:
- Pull the required model:
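The steps above correspond roughly to the following commands (a sketch: `poetry install` assumes a `pyproject.toml` in the repo root, and `llama2` is the model named in the project's Ollama setup):

```shell
# 1. Install Python dependencies into a Poetry-managed virtualenv
poetry install

# 2. Start the Ollama service in the background
ollama serve &

# 3. Pull the model used for local inference
ollama pull llama2
```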
## Usage
- Start the MCP server:
- The server will:
  - Initialize the LLM and embedding model
  - Ingest documents from the `data/` directory
  - Process queries using the RAG workflow
- Test with RISC Zero Bonsai docs:
  - Place RISC Zero Bonsai documentation in the `data/` directory
  - Query the server about Bonsai features and implementation
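Concretely, starting the server might look like this (a sketch: `mcp_server.py` is the entry point listed under Project Structure, and `poetry run` assumes dependencies were installed with Poetry):

```shell
# Run the MCP server inside the Poetry-managed environment
poetry run python mcp_server.py
```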
## Project Structure
- `mcp_server.py`: Main server implementation
- `rag.py`: RAG workflow implementation
- `data/`: Directory for document ingestion
- `storage/`: Vector store and document storage
- `start_ollama.sh`: Script to start the Ollama service
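A minimal sketch of what `start_ollama.sh` might contain (an assumption — the actual script is not shown here):

```shell
#!/usr/bin/env bash
# Start the Ollama service if it is not already running, then
# make sure the model used by the server is available locally.
set -euo pipefail

if ! pgrep -x ollama > /dev/null; then
    ollama serve &
    sleep 2   # give the service a moment to come up
fi

ollama pull llama2
```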
## Testing with RISC Zero Bonsai
The server is configured to work with RISC Zero's Bonsai documentation. You can:
- Add Bonsai documentation to the `data/` directory
- Query about Bonsai features, implementation details, and usage
- Test the RAG workflow with Bonsai-specific questions
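Since MCP servers speak JSON-RPC 2.0 over stdio, a Bonsai-specific question could be wrapped in a `tools/call` request shaped like the one below. This is only a sketch: the tool name `query_docs` is hypothetical (check the server's advertised tool list), and a real MCP client performs an `initialize` handshake before calling tools.

```shell
# Build the JSON-RPC request (tool name "query_docs" is hypothetical)
REQUEST='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"query_docs","arguments":{"question":"How does Bonsai handle proof requests?"}}}'

# Send it to the running server over stdio:
#   printf '%s\n' "$REQUEST" | poetry run python mcp_server.py
printf '%s\n' "$REQUEST" | python3 -m json.tool   # sanity-check the payload
```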
Made with ❤️ by proofofsid