# 🚀 Gemini MCP Server Deployment Guide
## 📋 Overview
This guide will help you migrate your existing MCP server from Cursor AI + Glama.ai to **Google Gemini** (100% FREE).
---
## 🔄 Migration Steps
### Step 1: Get FREE Gemini API Key
1. Visit [Google AI Studio](https://aistudio.google.com/app/apikey)
2. Sign in with your Google account
3. Click "Create API Key"
4. Copy your API key (starts with `AIza...`); you can verify it with the sketch below
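
Before wiring the key into the server, you can sanity-check it in isolation. This is a minimal sketch using the `google-generativeai` package; the script name and prompt are illustrative, not part of the migration files:

```python
# verify_gemini_key.py (illustrative) - confirm the API key works before migrating
import os

import google.generativeai as genai

# Read the key from the environment instead of hard-coding it
genai.configure(api_key=os.environ["GEMINI_API_KEY"])

# gemini-1.5-flash is the free-tier model referenced later in this guide
model = genai.GenerativeModel("gemini-1.5-flash")
response = model.generate_content("Reply with the single word: pong")
print(response.text)
```

Run it with `GEMINI_API_KEY=AIza... python verify_gemini_key.py`; any printed reply means the key and model are reachable.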
### Step 2: Update Dependencies
Replace your current `requirements.txt` with `requirements_updated.txt`:
```bash
# Install new dependencies
pip install -r requirements_updated.txt
# Remove old dependencies (if you had them)
pip uninstall glama-ai-client cursor-ai-client
```
### Step 3: Update Environment Variables
Create/update your `.env` file:
```bash
# REQUIRED: Google Gemini API Key (FREE)
GEMINI_API_KEY=AIza...your_key_here
# Keep your existing database and other settings
DATABASE_URL=your_existing_db_url
SERVER_NAME="MCP Chat Support Server"
SERVER_VERSION="2.0.0"
RATE_LIMIT_REQUESTS=100
RATE_LIMIT_PERIOD=60
# REMOVE these old variables:
# GLAMA_API_KEY=xxx
# GLAMA_API_URL=xxx
# CURSOR_API_KEY=xxx
```
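
For reference, here is a minimal sketch of how the server can read these variables at startup, assuming `python-dotenv` is among your dependencies; the module name `config.py` and the fallback values are illustrative:

```python
# config.py (illustrative) - load the settings defined in the .env file above
import os

from dotenv import load_dotenv

load_dotenv()  # reads the .env file from the working directory

GEMINI_API_KEY = os.getenv("GEMINI_API_KEY")
DATABASE_URL = os.getenv("DATABASE_URL")
SERVER_NAME = os.getenv("SERVER_NAME", "MCP Chat Support Server")
SERVER_VERSION = os.getenv("SERVER_VERSION", "2.0.0")
RATE_LIMIT_REQUESTS = int(os.getenv("RATE_LIMIT_REQUESTS", "100"))
RATE_LIMIT_PERIOD = int(os.getenv("RATE_LIMIT_PERIOD", "60"))

if not GEMINI_API_KEY:
    raise RuntimeError("GEMINI_API_KEY is not set - see Step 1 of this guide")
```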
### Step 4: Replace Your app.py
Replace your current `app.py` with `app_gemini.py`:
```bash
# Backup your current app.py
cp app.py app_original.py
# Use the new Gemini version
cp app_gemini.py app.py
```
---
## 🎯 What Changed?
### ✅ **Kept Everything You Had:**
- All MCP protocol implementation
- Authentication system
- Rate limiting & middleware
- Database integration
- Batch processing
- Error handling
- Health checks
- All your existing models and endpoints
### 🔄 **What Was Updated:**
- **AI Provider**: Cursor AI → Google Gemini
- **Context Provider**: Glama.ai → Internal context building (sketched below)
- **Dependencies**: Removed paid APIs, added free Gemini
- **Enhanced Features**: Better error handling, concurrent processing
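
To illustrate the internal context building mentioned above, here is a rough sketch that produces the `context` fields shown in the expected response further down; the function name and the naive language check are assumptions, not the exact code in `app_gemini.py`:

```python
from datetime import datetime, timezone

def build_context(query: str) -> dict:
    """Build lightweight request context locally instead of calling Glama.ai."""
    # Naive language guess: treat pure-ASCII queries as English
    language_detected = "en" if query.isascii() else "unknown"
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "query_length": len(query),
        "language_detected": language_detected,
    }
```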
---
## 🚀 Deployment Options
### Option 1: Local Development
```bash
# Clone/navigate to your project
cd your-mcp-server
# Install dependencies
pip install -r requirements_updated.txt
# Set environment variables
export GEMINI_API_KEY="your_key_here"
# Run the server
python app.py
```
### Option 2: Docker Deployment
```dockerfile
# Example Dockerfile (adapt your existing one as needed)
FROM python:3.9-slim
WORKDIR /app
COPY requirements_updated.txt .
RUN pip install -r requirements_updated.txt
COPY . .
ENV GEMINI_API_KEY=""
EXPOSE 8000
CMD ["python", "app.py"]
```
```bash
# Build and run
docker build -t mcp-gemini-server .
docker run -p 8000:8000 -e GEMINI_API_KEY="your_key" mcp-gemini-server
```
### Option 3: Railway Deployment (FREE)
1. Push your code to GitHub
2. Connect to [Railway](https://railway.app)
3. Add environment variable: `GEMINI_API_KEY`
4. Deploy automatically
### Option 4: Render Deployment (FREE)
1. Connect your GitHub repo to [Render](https://render.com)
2. Create a new Web Service
3. Add environment variable: `GEMINI_API_KEY`
4. Deploy
---
## 🧪 Testing Your Deployment
### Test Basic Functionality
```bash
# Health check
curl http://localhost:8000/mcp/health
# Test capabilities
curl http://localhost:8000/mcp/capabilities
# Test processing (add your auth header)
curl -X POST http://localhost:8000/mcp/process \
  -H "Content-Type: application/json" \
  -H "x-mcp-auth: your-auth-token" \
  -d '{
    "query": "Hello, I need help with my account",
    "user_id": "123",
    "priority": "normal"
  }'
```
### Expected Response
```json
{
  "response": "Hello! I'd be happy to help you with your account. Could you please tell me more about the specific issue you're experiencing?",
  "context": {
    "timestamp": "2025-01-11T...",
    "query_length": 34,
    "language_detected": "en"
  },
  "metadata": {
    "processed_at": "2025-01-11T...",
    "model": "gemini-1.5-flash",
    "ai_provider": "Google Gemini",
    "priority": "normal"
  },
  "mcp_version": "1.0",
  "processing_time": 0.85
}
```
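
If you prefer a scripted check, the sketch below exercises the same three endpoints with the `requests` library; the base URL, auth token, and assertions mirror the curl example and expected response above, so adjust them to your deployment:

```python
# smoke_test.py (illustrative) - hit the endpoints shown above and check key fields
import requests

BASE_URL = "http://localhost:8000"
HEADERS = {"x-mcp-auth": "your-auth-token"}

# Health and capabilities
print(requests.get(f"{BASE_URL}/mcp/health", timeout=10).json())
print(requests.get(f"{BASE_URL}/mcp/capabilities", timeout=10).json())

# Process a sample query and verify the documented response shape
payload = {
    "query": "Hello, I need help with my account",
    "user_id": "123",
    "priority": "normal",
}
resp = requests.post(f"{BASE_URL}/mcp/process", json=payload, headers=HEADERS, timeout=30)
resp.raise_for_status()
data = resp.json()
assert "response" in data and data["metadata"]["ai_provider"] == "Google Gemini"
print(data["response"])
```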
---
## 💰 Cost Comparison
| Service | Before | After |
|---------|--------|-------|
| **Cursor AI** | $20-50/month | **FREE** |
| **Glama.ai** | $15-30/month | **FREE** |
| **Gemini** | N/A | **FREE** |
| **Total** | $35-80/month | **$0/month** |
### Gemini Free Tier Limits:
- ✅ **15 requests per minute**
- ✅ **1 million tokens per day**
- ✅ **1500 requests per day**
- ✅ No credit card required
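
If your code fans out many Gemini calls (for example through batch processing), a small client-side throttle helps you stay under the 15-requests-per-minute limit above. This is a minimal single-process sketch, separate from the server's own rate-limiting middleware:

```python
import time
from collections import deque

class MinuteThrottle:
    """Allow at most `limit` calls to start within any rolling 60-second window."""

    def __init__(self, limit=15):
        self.limit = limit
        self.starts = deque()  # monotonic timestamps of recent call starts

    def wait(self):
        now = time.monotonic()
        # Forget call timestamps older than the 60-second window
        while self.starts and now - self.starts[0] >= 60:
            self.starts.popleft()
        if len(self.starts) >= self.limit:
            time.sleep(60 - (now - self.starts[0]))
        self.starts.append(time.monotonic())

# Usage: call throttle.wait() before each Gemini request, e.g. model.generate_content(...)
throttle = MinuteThrottle(limit=15)
```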
---
## 🔗 Frontend Integration
Your frontend integration remains **exactly the same**! Just point to your deployed MCP server:
```typescript
// In your React app - no changes needed!
const response = await fetch('https://your-deployed-server.com/mcp/process', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'x-mcp-auth': 'your-auth-token'
  },
  body: JSON.stringify({
    query: userMessage,
    user_id: currentUser.id,
    priority: 'normal'
  })
});
```
---
## 📝 Next Steps for Full SaaS
1. **✅ Backend**: Now you have a fully functional, free backend!
2. **🔌 Connect Frontend**: Integrate with your React app
3. **🔐 Authentication**: Implement user registration/login
4. **💳 Payments**: Add Stripe for paid plans (optional)
5. **📊 Analytics**: Track usage and improve
6. **🚀 Scale**: Add more features as needed
**You're now 90% ready for production!** 🎉