# Gemini MCP Server
An MCP (Model Context Protocol) server that provides access to Google Gemini AI models.
## Quick Start
Install dependencies:
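With npm (adjust if you use another package manager):

```bash
npm install
```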
Create a `.env` file with your Gemini API key:
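For example, with a placeholder value:

```
GEMINI_API_KEY=your_api_key_here
```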
Start the server:
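Assuming the project defines the usual `start` script:

```bash
npm start
```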
The server will run at http://localhost:3333/mcp
## Available Tool
### `gemini.generateText`
Generate text using Google Gemini models.
Parameters:
- `prompt` (string, required): The text prompt
- `model` (string, optional): Gemini model to use (default: `gemini-2.5-pro`)
- `temperature` (number, optional): Temperature for generation, 0-2 (default: 1)
Returns:
- `text`: Generated text response
- `model`: Model used
- `temperature`: Temperature setting used
## Usage Example
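A minimal client-side sketch, assuming the official MCP TypeScript SDK (`@modelcontextprotocol/sdk`) and that the server speaks the Streamable HTTP transport at `/mcp`; the client name, prompt, and parameter values are placeholders:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Connect to the locally running server over Streamable HTTP.
const transport = new StreamableHTTPClientTransport(new URL("http://localhost:3333/mcp"));
const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Call the gemini.generateText tool with an optional model and temperature.
const result = await client.callTool({
  name: "gemini.generateText",
  arguments: {
    prompt: "Write a haiku about the sea",
    model: "gemini-2.5-pro",
    temperature: 0.7,
  },
});

console.log(result.content);
await client.close();
```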
## Testing
Run the included test client (requires server to be running):
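The exact invocation depends on how the test client is exposed in `package.json`; if it is wired to the standard test script, that would be:

```bash
npm test
```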
## Configuration
Environment variables:
- `GEMINI_API_KEY` (required): Your Google Gemini API key
- `PORT` (optional): Server port (default: 3333)
## Deployment
Deploy to Google Cloud Run with automated scripts:
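The scripts referenced here live in the repository and are the recommended path; purely as an illustration of what such a deploy boils down to, a direct source-based deploy with the `gcloud` CLI looks roughly like this (service name, region, and key are placeholders):

```bash
gcloud run deploy gemini-mcp-server \
  --source . \
  --region us-central1 \
  --set-env-vars GEMINI_API_KEY=your_api_key_here \
  --allow-unauthenticated
```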
For detailed deployment instructions, see [DEPLOYMENT.md](DEPLOYMENT.md).
## License
ISC