# Gemini MCP Server
An MCP (Model Context Protocol) server that provides access to Google Gemini AI models.
## Quick Start
Install dependencies:
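An npm-based Node.js project is assumed here; substitute your package manager if the repository specifies a different one:

```bash
npm install
```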
Create a `.env` file with your Gemini API key:
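```bash
# .env — GEMINI_API_KEY is required; PORT is optional (see Configuration below)
GEMINI_API_KEY=your-gemini-api-key
PORT=3333
```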
Start the server:
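Assuming the package defines a standard `start` script (otherwise run the server entry file directly with `node`):

```bash
npm start
```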
The server will run at `http://localhost:3333/mcp`.
## Available Tool

### gemini.generateText
Generate text using Google Gemini models.
Parameters:
- `prompt` (string, required): The text prompt
- `model` (string, optional): Gemini model to use (default: `gemini-2.5-pro`)
- `temperature` (number, optional): Temperature for generation, 0-2 (default: 1)
Returns:
- `text`: Generated text response
- `model`: Model used
- `temperature`: Temperature setting used
## Usage Example
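The tool can be called from any MCP client. The sketch below uses the official TypeScript SDK (`@modelcontextprotocol/sdk`) over Streamable HTTP; the transport choice, client name, and prompt are assumptions for illustration, and the exact shape of the returned content depends on how the server serializes its `text`/`model`/`temperature` result.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

async function main() {
  // Connect to the locally running Gemini MCP server (see Quick Start above)
  const transport = new StreamableHTTPClientTransport(
    new URL("http://localhost:3333/mcp")
  );
  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Invoke the gemini.generateText tool; model and temperature are optional
  const result = await client.callTool({
    name: "gemini.generateText",
    arguments: {
      prompt: "Write a haiku about the ocean.",
      model: "gemini-2.5-pro", // default
      temperature: 0.7,        // 0-2, default 1
    },
  });

  // Tool results arrive as MCP content blocks
  console.log(JSON.stringify(result.content, null, 2));
  await client.close();
}

main().catch(console.error);
```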
## Testing
Run the included test client (requires server to be running):
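The test client's entry point is not named here, so the command below is a hypothetical placeholder; check `package.json` for the actual script or file:

```bash
# Hypothetical — replace with the repository's actual test script or client file
npm test
```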
## Configuration
Environment variables:
- `GEMINI_API_KEY` (required): Your Google Gemini API key
- `PORT` (optional): Server port (default: 3333)
## License
ISC