# 🤖 Gemini MCP Server

A powerful AI Customer Support Bot powered by **Google Gemini** and following the **Model Context Protocol (MCP)** standard.

![License](https://img.shields.io/badge/license-MIT-blue.svg) ![Python](https://img.shields.io/badge/python-3.9+-green.svg) ![FastAPI](https://img.shields.io/badge/FastAPI-0.104+-red.svg) ![Gemini](https://img.shields.io/badge/AI-Google%20Gemini-blue.svg)

## 🚀 Quick Deploy to Hugging Face Spaces

This backend is configured for deployment on Hugging Face Spaces. The deployment uses the `app.py` file as the main entry point.

### Environment Variables Required

- `GEMINI_API_KEY`: Your Google Gemini API key
- `DATABASE_URL`: Database connection string (optional, defaults to SQLite)

### Deployment Steps

1. Fork this repository to your GitHub account
2. Go to [Hugging Face Spaces](https://huggingface.co/spaces)
3. Click "Create new Space"
4. Choose "Docker" as the SDK
5. Connect your GitHub repository
6. Set the required environment variables in the Space settings
7. Deploy!

## ✨ Features

- 🤖 **Google Gemini AI** - Fast, free, and powerful AI responses
- 🔄 **MCP Protocol** - Full Model Context Protocol implementation
- ⚡ **High Performance** - Async/await, concurrent batch processing
- 🛡️ **Rate Limiting** - Built-in protection against abuse
- 📊 **Database Integration** - Store conversations and analytics
- 🌍 **Multi-language** - Support for 9+ languages
- 🚀 **Production Ready** - Docker, health checks, monitoring
- 💰 **100% FREE** - No API costs, generous limits

## 🚀 Quick Start

### 1. Get a FREE Gemini API Key

1. Visit [Google AI Studio](https://aistudio.google.com/app/apikey)
2. Sign in with your Google account
3. Click "Create API Key"
4. Copy your API key

### 2. Clone & Setup

```bash
git clone https://github.com/your-username/gemini-mcp-server.git
cd gemini-mcp-server

# Install dependencies
pip install -r requirements.txt

# Copy environment file
cp env.example .env

# Add your Gemini API key to .env
GEMINI_API_KEY=your_key_here
```

### 3. Run the Server

```bash
python app.py
```

The server will start at `http://localhost:8000`.

## 📡 API Endpoints

### Health Check

```bash
GET /mcp/health
```

### Process Single Query

```bash
POST /mcp/process
{
  "query": "Hello, I need help with my account",
  "user_id": "123",
  "priority": "normal"
}
```

### Batch Processing

```bash
POST /mcp/batch
{
  "queries": ["Question 1", "Question 2"],
  "user_id": "123"
}
```

### Get Capabilities

```bash
GET /mcp/capabilities
```

## 🔧 Configuration

All configuration is done through environment variables in `.env`:

```bash
# Required
GEMINI_API_KEY=your_key_here

# Optional
DATABASE_URL=sqlite:///./mcp_server.db
RATE_LIMIT_REQUESTS=100
RATE_LIMIT_PERIOD=60
DEBUG=false
```

## 🚀 Deployment

### Railway (FREE)

1. Push to GitHub
2. Connect to [Railway](https://railway.app)
3. Add the `GEMINI_API_KEY` environment variable
4. Deploy automatically

### Render (FREE)

1. Connect your GitHub repo to [Render](https://render.com)
2. Create a new Web Service
3. Add the `GEMINI_API_KEY` environment variable
4. Deploy

### Docker

```bash
docker build -t gemini-mcp-server .
docker run -p 8000:8000 -e GEMINI_API_KEY="your_key" gemini-mcp-server
```

## 📊 Performance

- **Response Time**: ~0.5-2 seconds
- **Throughput**: 100+ requests/minute
- **Concurrency**: Unlimited concurrent batch processing
- **Rate Limits**: Configurable per client IP

## 🔒 Security

- Rate limiting per IP address
- Request validation middleware
- Environment variable configuration
- SQL injection protection
- CORS configuration

## 🌍 Multi-Language Support

Supports customer queries in:

- English (en)
- Spanish (es)
- French (fr)
- German (de)
- Italian (it)
- Portuguese (pt)
- Japanese (ja)
- Korean (ko)
- Chinese (zh)

## 📈 Database Schema

The server includes models for:

- `ChatMessage` - Store conversations
- `User` - User management
- `Tenant` - Multi-tenant support

## 🔗 Frontend Integration

```typescript
// React/TypeScript example
const response = await fetch('https://your-server.com/mcp/process', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'x-mcp-auth': 'your-auth-token'
  },
  body: JSON.stringify({
    query: userMessage,
    user_id: currentUser.id,
    priority: 'normal'
  })
});

const data = await response.json();
console.log(data.response); // AI response
```

## 💰 Cost Analysis

| Feature | Cost |
|---------|------|
| **Google Gemini** | FREE |
| **Database** | FREE (SQLite) |
| **Hosting** | FREE (Railway/Render) |
| **Rate Limits** | 15 requests/min, 1M tokens/day |
| **Total** | **$0/month** |

## 🛠️ Development

### Setup Development Environment

```bash
# Clone repository
git clone https://github.com/your-username/gemini-mcp-server.git
cd gemini-mcp-server

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Copy environment file
cp env.example .env

# Run in development mode
python app.py
```

### Run Tests

```bash
# Install test dependencies
pip install pytest pytest-asyncio httpx

# Run tests
pytest
```

## 📋 TODO

- [ ] Add authentication system
- [ ] Implement WebSocket support
- [ ] Add conversation history
- [ ] Create admin dashboard
- [ ] Add metrics and logging
- [ ] Implement caching layer

## 🤝 Contributing

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

## 📄 License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## 🙏 Acknowledgments

- Google for providing free Gemini AI
- FastAPI for the excellent web framework
- The MCP community for protocol standards

---

**🌟 Star this repository if you found it helpful!**

**🔗 [Live Demo](https://your-demo-url.com) | [Documentation](https://your-docs-url.com)**
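Beyond the TypeScript frontend snippet, the `/mcp/process` endpoint can also be called from Python. This is a minimal client sketch, not part of the project: the payload shape and `x-mcp-auth` header are copied from the examples in this README, and `build_process_request` / `send_query` are hypothetical helper names.

```python
import json
import urllib.request


def build_process_request(query: str, user_id: str, priority: str = "normal") -> dict:
    """Build the JSON payload for POST /mcp/process (field names taken from the README examples)."""
    return {"query": query, "user_id": user_id, "priority": priority}


def send_query(base_url: str, payload: dict, auth_token: str) -> dict:
    """POST the payload to the server and return the parsed JSON response."""
    req = urllib.request.Request(
        base_url.rstrip("/") + "/mcp/process",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", "x-mcp-auth": auth_token},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    payload = build_process_request("Hello, I need help with my account", user_id="123")
    print(json.dumps(payload))
    # Requires a running server:
    # send_query("http://localhost:8000", payload, "your-auth-token")
```

Only the standard library is used, so this works without installing the server's dependencies.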
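The per-IP rate limiting described in the Security section is commonly implemented as a sliding-window counter. The sketch below is an illustration of that idea, not the project's actual middleware; the `limit`/`period` defaults mirror the `RATE_LIMIT_REQUESTS` and `RATE_LIMIT_PERIOD` settings shown in the Configuration section.

```python
import time
from collections import defaultdict, deque
from typing import Deque, Dict, Optional


class SlidingWindowLimiter:
    """Allow at most `limit` requests per `period` seconds for each client key (e.g. an IP)."""

    def __init__(self, limit: int = 100, period: float = 60.0):
        self.limit = limit
        self.period = period
        self._hits: Dict[str, Deque[float]] = defaultdict(deque)

    def allow(self, key: str, now: Optional[float] = None) -> bool:
        """Record one request for `key`; return False if the window is already full."""
        now = time.monotonic() if now is None else now
        window = self._hits[key]
        # Drop timestamps that have fallen out of the window.
        while window and now - window[0] >= self.period:
            window.popleft()
        if len(window) >= self.limit:
            return False
        window.append(now)
        return True
```

In a FastAPI middleware this would typically be keyed on the client IP and return an HTTP 429 when `allow()` is False.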
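The Features list advertises concurrent batch processing via async/await; the usual pattern for fanning out a `/mcp/batch` request is `asyncio.gather`. A sketch under assumptions: `fake_gemini_call` is a stub standing in for the real Gemini client, which is not shown in this README.

```python
import asyncio
from typing import List


async def fake_gemini_call(query: str) -> str:
    # Stub for the real Gemini API call; sleeps to simulate network latency.
    await asyncio.sleep(0.01)
    return f"answer to: {query}"


async def process_batch(queries: List[str]) -> List[str]:
    """Run all queries concurrently; gather preserves the original order."""
    return list(await asyncio.gather(*(fake_gemini_call(q) for q in queries)))


if __name__ == "__main__":
    print(asyncio.run(process_batch(["Question 1", "Question 2"])))
```

Because the calls run concurrently, a batch of N queries takes roughly as long as the slowest single call rather than the sum of all of them.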
