
🧠 MCP AI Server — Model Context Protocol for Intelligent Search

Welcome to the MCP AI Server, a powerful and modular tool that uses RAG-based retrieval, Pinecone vector storage, and MCP (Model Context Protocol) to create intelligent assistants capable of answering domain-specific questions from your own knowledge base.

MCP + Claude + Pinecone · Python · License: MIT


🚀 Features

- ✅ Local MCP server with FastAPI + Claude/ChatGPT integration
- ✅ Embedding using `intfloat/multilingual-e5-large` (via SentenceTransformer)
- ✅ Fast vector search with Pinecone
- ✅ Documented tools exposed to clients like Claude and Cursor IDE
- ✅ Secure `.env` usage for managing API keys
- ✅ Clean, extensible architecture
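The retrieval pipeline above — embed the query with E5, then search Pinecone — can be sketched as follows. This is an illustrative sketch, not code from this repo: the index name `my-knowledge-base` and the `search` helper are assumptions; note that E5 models expect `query: ` / `passage: ` prefixes on input text.

```python
# Illustrative sketch of the embed-then-search flow (names are assumed,
# not taken from this repository).
def e5_prefix(text: str, kind: str = "query") -> str:
    """intfloat/multilingual-e5-large expects 'query: ' / 'passage: ' prefixes."""
    return f"{kind}: {text.strip()}"

def search(question: str, top_k: int = 5):
    # Heavy imports are kept inside the function so the helper above
    # works even without the ML stack installed.
    import os
    from sentence_transformers import SentenceTransformer
    from pinecone import Pinecone

    model = SentenceTransformer("intfloat/multilingual-e5-large")
    vector = model.encode(e5_prefix(question), normalize_embeddings=True).tolist()

    pc = Pinecone(api_key=os.environ["PINECONE_API_KEY"])
    index = pc.Index("my-knowledge-base")  # assumed index name
    return index.query(vector=vector, top_k=top_k, include_metadata=True)
```

The returned matches carry metadata (e.g. the source passage), which the assistant can use as context when answering.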



🔧 Setup Instructions

1. Clone the Repo

```bash
git clone git@github.com:MeetRathodNitsan/MCP1.git
cd MCP1
```

2. Create a Virtual Environment

```bash
python -m venv .venv

# Windows
.venv\Scripts\activate

# macOS/Linux
source .venv/bin/activate
```

3. Install Dependencies

```bash
pip install -r requirements.txt
```

4. Configure Environment Variables

Create a `.env` file in the project root:

```env
OPENAI_API_KEY=your-openai-api-key
PINECONE_API_KEY=your-pinecone-api-key
PINECONE_ENVIRONMENT=your-env
```
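Projects like this typically read the `.env` file with `python-dotenv`'s `load_dotenv()`. The stdlib-only sketch below shows what that loading step amounts to; the `load_env_file` helper is illustrative, not part of this repo.

```python
import os

def load_env_file(path: str = ".env") -> dict:
    """Tiny stdlib .env reader (python-dotenv's load_dotenv() does the same job)."""
    env = {}
    try:
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                # Skip blanks, comments, and lines without KEY=VALUE shape.
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    env[key.strip()] = value.strip()
    except FileNotFoundError:
        pass  # no .env present; fall back to the existing environment
    os.environ.update(env)
    return env
```

Keeping keys in `.env` (and `.env` in `.gitignore`) means they never land in source control.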

5. Run the Server

```bash
uv --directory F:/Project run main.py
```

Replace `F:/Project` with the path to your cloned repo.
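Once running, the server exposes its documented tools to MCP clients such as Claude and Cursor. A minimal sketch of how a tool might be registered with the official MCP Python SDK's `FastMCP` is shown below — the tool name, server name, and body are hypothetical, and this repo's actual implementation (which the feature list says uses FastAPI) may differ.

```python
# Hypothetical sketch of exposing a tool over MCP; names and body are
# illustrative, not taken from this repository.
def answer_question(question: str) -> str:
    """Answer a domain-specific question from the indexed knowledge base."""
    # ... embed the question, query Pinecone, assemble an answer ...
    return f"(answer for: {question})"

try:
    from mcp.server.fastmcp import FastMCP  # official MCP Python SDK
    server = FastMCP("mcp-ai-server")       # assumed server name
    server.tool()(answer_question)          # same as decorating with @server.tool()
except ImportError:
    server = None  # SDK not installed; the plain function above still works

# To serve clients over stdio: server.run()
```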
