🚀 Agentic RAG with MCP Server


✨ Overview

Agentic RAG with MCP Server is a powerful project that brings together an MCP (Model Context Protocol) server and client for building Agentic RAG (Retrieval-Augmented Generation) applications.

This setup empowers your RAG system with advanced tools such as:

  • 🕵️‍♂️ Entity Extraction

  • 🔍 Query Refinement

  • ✅ Relevance Checking

The server hosts these intelligent tools, while the client shows how to seamlessly connect and utilize them.


🖥️ Server - server.py

Powered by the FastMCP class from the mcp library, the server exposes these handy tools:

| Tool Name | Description | Icon |
| --- | --- | --- |
| get_time_with_prefix | Returns the current date & time | ⏰ |
| extract_entities_tool | Uses OpenAI to extract entities from a query, enhancing document retrieval relevance | 🧠 |
| refine_query_tool | Improves the quality of user queries with OpenAI-powered refinement | ✨ |
| check_relevance | Filters out irrelevant content by checking chunk relevance with an LLM | ✅ |
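
Below is a minimal sketch of how one of these tools could be registered with FastMCP. The tool name follows the table above, but the server name and the function body are illustrative stand-ins, not the project's actual implementation.

```python
# Minimal sketch: registering an MCP tool with FastMCP.
# Illustrative only; server.py in the repository defines more tools.
from datetime import datetime

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("agentic-rag")  # server name is an assumption


@mcp.tool()
def get_time_with_prefix() -> str:
    """Return the current date and time."""
    return f"Current date and time: {datetime.now().isoformat()}"


if __name__ == "__main__":
    mcp.run()  # defaults to the stdio transport
```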


🤝 Client - mcp-client.py

The client demonstrates how to connect and interact with the MCP server:

  • Establish a connection with ClientSession from the mcp library

  • List all available server tools

  • Call any tool with custom arguments

  • Process queries leveraging OpenAI or Gemini and MCP tools in tandem
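
A minimal sketch of that flow, assuming the server is launched over stdio, might look like this (the tool name and arguments are illustrative, and the repository's mcp-client.py may differ):

```python
# Minimal sketch of an MCP client talking to server.py over stdio.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch server.py as a subprocess and communicate over stdio
    server_params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List all available server tools
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

            # Call a tool with custom arguments (this one takes none)
            result = await session.call_tool("get_time_with_prefix", arguments={})
            print(result.content)


asyncio.run(main())
```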


βš™οΈ Requirements

  • Python 3.9 or higher

  • openai Python package

  • mcp library

  • python-dotenv for environment variable management
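
If you prefer not to use the repository's requirements.txt, the same dependencies can be installed directly (package names as published on PyPI):

pip install openai mcp python-dotenv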


πŸ› οΈ Installation Guide

# Step 1: Clone the repository
git clone https://github.com/ashishpatel26/Agentic-RAG-with-MCP-Server.git

# Step 2: Navigate into the project directory
cd Agentic-RAG-with-MCP-Server

# Step 3: Install dependencies
pip install -r requirements.txt

πŸ” Configuration

  1. Create a .env file (use .env.sample as a template)

  2. Set your OpenAI model name and Gemini API key in .env:

OPENAI_MODEL_NAME="your-model-name-here"
GEMINI_API_KEY="your-api-key-here"
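
Since python-dotenv is among the requirements, these values are presumably loaded at runtime along these lines (a sketch; only the variable names above are taken from the project):

```python
# Sketch: reading the .env settings with python-dotenv.
import os

from dotenv import load_dotenv

load_dotenv()  # picks up .env from the project directory

openai_model = os.getenv("OPENAI_MODEL_NAME")
gemini_api_key = os.getenv("GEMINI_API_KEY")
```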

🚀 How to Use

  1. Start the MCP server:

python server.py

  2. Run the MCP client:

python mcp-client.py

📜 License

This project is licensed under the MIT License.


Thanks for Reading 🙏
