Enables access to Google's Gemini AI models for answering questions, reviewing code, and providing expert guidance in different domains with customizable parameters.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Gemini MCP Server explain quantum computing in simple terms".
That's it! The server will respond to your query, and you can continue using it as needed.
Gemini MCP Server
A Model Context Protocol (MCP) server that enables Claude to interact with Google's Gemini AI models.
Inspired by this GitHub repo
Setup
Get a Gemini API key from Google AI Studio
Add to the Claude Desktop config (~/Library/Application Support/Claude/claude_desktop_config.json):
{
  "mcpServers": {
    "gemini": {
      "command": "uvx",
      "args": ["--from", "/path/to/gemini-mcp", "gemini-mcp"],
      "env": {
        "GEMINI_API_KEY": "your_api_key_here",
        "MODEL_NAME": "gemini-2.5-flash-preview-05-20"
      }
    }
  }
}

Replace /path/to/gemini-mcp with the actual path to this directory.
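After editing the config, a stray comma or quote will silently break it. A quick sanity check is to run the file through Python's JSON parser (the macOS path shown matches the config location above; adjust it for your system):

```shell
# Validate the Claude Desktop config after editing.
# Exits non-zero and prints the error location if the JSON is malformed.
CONFIG="$HOME/Library/Application Support/Claude/claude_desktop_config.json"
python3 -m json.tool "$CONFIG" > /dev/null && echo "config OK"
```

If validation fails, Claude Desktop will typically ignore the server entry entirely, so this check is worth running before restarting the app.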
Usage
ask_gemini
Ask Gemini questions directly from Claude.
prompt: Your question for Gemini
temperature: Response creativity (0.0-1.0, default: 0.5)
context: Additional context
persona: Role for Gemini (e.g., "senior architect", "security expert")
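A minimal sketch of how these parameters might be combined into a single request. The function name and prompt layout here are assumptions for illustration, not the server's actual implementation; temperature is shown separately because it would go into the model's generation config rather than the prompt text:

```python
def build_gemini_prompt(prompt, context=None, persona=None):
    """Assemble one prompt string from the ask_gemini parameters.

    Hypothetical helper: the real server may format these differently.
    """
    parts = []
    if persona:
        # Persona becomes a role instruction prepended to the question.
        parts.append(f"You are a {persona}.")
    if context:
        parts.append(f"Context: {context}")
    parts.append(prompt)
    return "\n\n".join(parts)


# temperature (0.0-1.0) is not part of the prompt; it would be passed
# in the generation config alongside the assembled text.
print(build_gemini_prompt(
    "Review this code: [code]",
    context="I'm building a web API",
    persona="senior developer",
))
```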
server_info
Check server status and Gemini availability.
Examples
Ask Gemini: What are the latest trends in machine learning?
Ask Gemini to review this code as a senior developer: [code]
Ask Gemini about Python best practices with context that I'm building a web API
Troubleshooting
API key error: Set GEMINI_API_KEY in your environment
Rate limit: Wait and try again
Content filtered: Rephrase your request
Server issues: Use the server_info tool to check status
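For the API key error, a quick shell check confirms whether the variable is visible in the environment (GEMINI_API_KEY is the name used in the config above):

```shell
# Check whether the key is set in the current shell; note that
# variables set here are not seen by a GUI-launched Claude Desktop,
# which reads the key from the config file's "env" block instead.
if [ -z "$GEMINI_API_KEY" ]; then
  echo "GEMINI_API_KEY is not set"
else
  echo "GEMINI_API_KEY is set"
fi
```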