
MCP-RAG Server

by apatoliya

MCP-RAG: Model Context Protocol with RAG 🚀

A powerful and efficient RAG (Retrieval-Augmented Generation) implementation using GroundX and OpenAI, built on the Model Context Protocol (MCP).

🌟 Features

  • Advanced RAG Implementation: Utilizes GroundX for high-accuracy document retrieval
  • Model Context Protocol: Seamless integration with MCP for enhanced context handling
  • Type-Safe: Built with Pydantic for robust type checking and validation
  • Flexible Configuration: Easy-to-customize settings through environment variables
  • Document Ingestion: Support for PDF document ingestion and processing
  • Intelligent Search: Semantic search capabilities with scoring

🛠️ Prerequisites

  • Python 3.12 or higher
  • OpenAI API key
  • GroundX API key
  • MCP CLI tools

📦 Installation

  1. Clone the repository:
git clone <repository-url>
cd mcp-rag
  2. Create and activate a virtual environment:
uv sync
source .venv/bin/activate  # On Windows, use `.venv\Scripts\activate`

⚙️ Configuration

  1. Copy the example environment file:
cp .env.example .env
  2. Configure your environment variables in .env:
GROUNDX_API_KEY="your-groundx-api-key"
OPENAI_API_KEY="your-openai-api-key"
BUCKET_ID="your-bucket-id"
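
Because the project uses Pydantic for validation, these variables are most naturally surfaced to the code as a typed settings object. Below is a minimal sketch of what that can look like, assuming the variables are already present in the process environment (e.g. exported by your shell or a dotenv loader); the Settings class and field names are illustrative, not the project's actual code:

import os
from pydantic import BaseModel

class Settings(BaseModel):
    # Illustrative settings model; the real server may structure this differently.
    groundx_api_key: str
    openai_api_key: str
    bucket_id: str

def load_settings() -> Settings:
    # Pydantic raises a ValidationError if any of these variables is unset.
    return Settings(
        groundx_api_key=os.getenv("GROUNDX_API_KEY"),
        openai_api_key=os.getenv("OPENAI_API_KEY"),
        bucket_id=os.getenv("BUCKET_ID"),
    )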

🚀 Usage

Starting the Server

Run the server under the MCP Inspector using:

mcp dev server.py
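
This starts server.py in the MCP Inspector for interactive testing. For orientation, here is a minimal sketch of how a server like this can expose the operations described below as MCP tools via the SDK's FastMCP helper; the tool bodies are stubs and the details are assumptions, not the project's actual implementation:

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("mcp-rag")

@mcp.tool()
def ingest_documents(path: str) -> str:
    # Stub: the real tool would upload the PDF to a GroundX bucket.
    return f"would ingest {path}"

@mcp.tool()
def process_search_query(query: str) -> str:
    # Stub: the real tool would run a GroundX search and an OpenAI completion.
    return f"would answer {query}"

if __name__ == "__main__":
    mcp.run()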

Document Ingestion

To ingest new documents:

from server import ingest_documents

result = ingest_documents("path/to/your/document.pdf")
print(result)
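
Since ingest_documents takes a single file path, ingesting a folder of PDFs is just a loop over the documented function; the directory name below is only an example:

from pathlib import Path
from server import ingest_documents

# Ingest every PDF found under ./docs (example directory).
for pdf_path in sorted(Path("docs").glob("*.pdf")):
    print(f"{pdf_path.name}: {ingest_documents(str(pdf_path))}")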

Performing Searches

Basic search query:

from server import process_search_query

response = process_search_query("your search query here")
print(f"Query: {response.query}")
print(f"Score: {response.score}")
print(f"Result: {response.result}")

With custom configuration:

from server import process_search_query, SearchConfig

config = SearchConfig(
    completion_model="gpt-4",
    bucket_id="custom-bucket-id"
)
response = process_search_query("your query", config)
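
The same interface also makes it easy to run one query against several configurations, for example to compare completion models or buckets. A sketch using only the functions shown above (the labels and bucket ID are placeholders):

from server import process_search_query, SearchConfig

query = "your query"
configs = {
    "default": None,
    "gpt-4 + custom bucket": SearchConfig(completion_model="gpt-4", bucket_id="custom-bucket-id"),
}

for label, config in configs.items():
    # Fall back to the default configuration when none is given.
    response = process_search_query(query) if config is None else process_search_query(query, config)
    print(f"[{label}] score={response.score}")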

📚 Dependencies

  • groundx (≥2.3.0): Core RAG functionality
  • openai (≥1.75.0): OpenAI API integration
  • mcp[cli] (≥1.6.0): Model Context Protocol SDK and CLI tools
  • ipykernel (≥6.29.5): Jupyter notebook support

🔒 Security

  • Never commit your .env file containing API keys
  • Use environment variables for all sensitive information
  • Regularly rotate your API keys
  • Monitor API usage for any unauthorized access
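
A simple way to enforce the environment-variable rule is to fail fast at startup when a key is missing rather than falling back to a hard-coded value. A small, purely illustrative check:

import os

REQUIRED_VARS = ("GROUNDX_API_KEY", "OPENAI_API_KEY", "BUCKET_ID")

# Refuse to start if any credential is missing from the environment.
missing = [name for name in REQUIRED_VARS if not os.getenv(name)]
if missing:
    raise RuntimeError(f"Missing required environment variables: {', '.join(missing)}")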

🤝 Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

hybrid server

The server is able to function both locally and remotely, depending on the configuration or use case.
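
For local use, an MCP client can spawn server.py as a subprocess and talk to it over stdio. Here is a sketch using the MCP Python SDK's client API; the launch command is an assumption about how the server is run:

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch server.py as a local stdio MCP server (command/args are assumptions).
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

asyncio.run(main())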


Related MCP Servers

  • A server that enables document searching using Vertex AI with Gemini grounding, improving search results by grounding responses in private data stored in Vertex AI Datastore. (Python, Apache 2.0, last updated 9 days ago)
  • A Retrieval-Augmented Generation server that enables semantic PDF search with OCR capabilities, allowing users to query document content through any MCP client and receive intelligent answers. (Python, Apache 2.0, last updated 4 months ago)
  • Implements Retrieval-Augmented Generation (RAG) using GroundX and OpenAI, allowing users to ingest documents and perform semantic searches with advanced context handling through Modern Context Processing (MCP). (Python, Linux, Apple, last updated 4 months ago)
  • A server that integrates Retrieval-Augmented Generation (RAG) with the Model Context Protocol (MCP) to provide web search capabilities and document analysis for AI assistants. (Python, Apache 2.0, last updated 3 months ago)

View all related MCP servers

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/apatoliya/mcp-rag'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.