
# 🧠 MCP AI Server – Model Context Protocol for Intelligent Search

Welcome to the MCP AI Server, a modular tool that combines RAG-based retrieval, Pinecone vector storage, and the Model Context Protocol (MCP) to build intelligent assistants that can answer domain-specific questions from your own knowledge base.

MCP + Claude + Pinecone · Python · License: MIT


## 🚀 Features

- ✅ Local MCP server with FastAPI + Claude/ChatGPT integration
- ✅ Embedding using `intfloat/multilingual-e5-large` (via SentenceTransformer)
- ✅ Fast vector search with Pinecone
- ✅ Documented tools exposed to clients like Claude and Cursor IDE
- ✅ Secure `.env` usage for managing API keys
- ✅ Clean, extensible architecture
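The embedding and search features above can be sketched in a few lines. This is a hedged sketch, not the project's actual code: the index name `mcp-knowledge` and the helper names are illustrative, while the calls follow the public SentenceTransformer and Pinecone v3 SDK APIs.

```python
# Sketch of the retrieval path: embed a question, then search Pinecone.
import os

def e5_query(text: str) -> str:
    # intfloat/multilingual-e5-large expects a "query: " prefix on search
    # queries (and "passage: " on documents at indexing time).
    return f"query: {text}"

def search(question: str, top_k: int = 5):
    # Imports are local so the module loads even without the heavy deps.
    from sentence_transformers import SentenceTransformer
    from pinecone import Pinecone

    model = SentenceTransformer("intfloat/multilingual-e5-large")
    pc = Pinecone(api_key=os.environ["PINECONE_API_KEY"])
    index = pc.Index("mcp-knowledge")  # illustrative index name

    vector = model.encode(e5_query(question), normalize_embeddings=True)
    return index.query(vector=vector.tolist(), top_k=top_k, include_metadata=True)
```

The matching results' metadata can then be passed to Claude or ChatGPT as grounding context for the answer.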



## 🔧 Setup Instructions

### 1. Clone the Repo

```shell
git clone git@github.com:MeetRathodNitsan/MCP1.git
cd MCP1
```

### 2. Create a Virtual Environment

```shell
python -m venv .venv

# Windows
.venv\Scripts\activate

# macOS/Linux
source .venv/bin/activate
```

### 3. Install Dependencies

```shell
pip install -r requirements.txt
```

### 4. Configure Environment Variables

Create a `.env` file in the project root:

```
OPENAI_API_KEY=your-api-key...
PINECONE_API_KEY=...
PINECONE_ENVIRONMENT=your-env
```
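Inside the server, a small stdlib-only check can fail fast when a key is missing. The variable names match the `.env` example above; `check_env` is an illustrative helper, and a library such as python-dotenv would normally load the file into `os.environ` first.

```python
import os

REQUIRED_KEYS = ("OPENAI_API_KEY", "PINECONE_API_KEY", "PINECONE_ENVIRONMENT")

def check_env() -> dict:
    """Return the required settings, raising early if any are missing."""
    missing = [key for key in REQUIRED_KEYS if not os.getenv(key)]
    if missing:
        raise RuntimeError("Missing environment variables: " + ", ".join(missing))
    return {key: os.environ[key] for key in REQUIRED_KEYS}
```

Failing at startup gives a clearer error than a Pinecone or OpenAI authentication failure deep inside a request handler.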

### 5. Run the Server

```shell
uv --directory F:/Project run main.py
```

Replace `F:/Project` with the path to your local `MCP1` checkout.
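To expose the server to a client such as Claude Desktop, an entry along these lines follows the usual `mcpServers` configuration convention (the server name `mcp-ai-server` is illustrative, and the path should match your checkout):

```json
{
  "mcpServers": {
    "mcp-ai-server": {
      "command": "uv",
      "args": ["--directory", "F:/Project", "run", "main.py"]
    }
  }
}
```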
