Provides access to Google's Gemini API from Claude Desktop, enabling direct interaction with various Gemini models (gemini-pro, gemini-1.5-pro, gemini-1.5-flash), with customizable parameters such as temperature and max output tokens, conversation history management, and search-oriented prompting for more comprehensive answers.

# MCP Gemini Server

This MCP server allows you to access Google's Gemini API directly from Claude Desktop.

## Features

- Call the Gemini API with customizable parameters
- Ask Claude and Gemini to talk to each other in a long-running discussion!
- Configure model versions, temperature, and other parameters
- Access various Gemini models (gemini-pro, gemini-1.5-pro, gemini-1.5-flash, etc.)
- Maintain conversation history for context-aware responses
- Use your own Google AI Studio API key

## Setup Instructions

### Installing via Smithery

To install the Gemini Server for Claude Desktop via Smithery, follow the Smithery installation flow for this server. After deployment, you should be able to access your MCP server at the URL that Smithery provides.

### Prerequisites

- Node.js 18+ (for Smithery deployment)
- Claude Desktop application

### Installation for Local Development

1. Clone this repository:

   ```bash
   git clone https://github.com/your-username/gemini_mcp.git
   cd gemini_mcp
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Set your Gemini API key in the Smithery dashboard or through the config file.

4. Deploy to Smithery:

   ```bash
   npm run deploy
   ```
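
Regarding step 3: the server has to resolve the API key at startup. The sketch below is only a hypothetical illustration of that lookup; the `GEMINI_API_KEY` variable, the `config.json` file, and the helper name are assumptions, not this repository's actual code.

```typescript
import { readFileSync } from "node:fs";

// Hypothetical helper: look for the Gemini API key in the environment first
// (e.g. a value injected by the Smithery dashboard), then in a local config file.
function resolveGeminiApiKey(): string {
  const fromEnv = process.env.GEMINI_API_KEY;
  if (fromEnv) return fromEnv;

  try {
    const config = JSON.parse(readFileSync("config.json", "utf8"));
    if (typeof config.apiKey === "string") return config.apiKey;
  } catch {
    // No readable config file; fall through to the error below.
  }

  throw new Error("Gemini API key not found: set GEMINI_API_KEY or add apiKey to config.json");
}
```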

## Using with Claude Desktop

1. Configure Claude Desktop to use this MCP server by following the instructions in the MCP Quickstart Guide.
2. Add the following configuration to your Claude Desktop settings:
   - Name: Gemini
   - API Key: your-gemini-api-key-here
3. Restart Claude Desktop.

You can now use the Gemini API through Claude by asking questions that mention Gemini.

## Available Tools

The MCP server provides the following tools:

- `ask_gemini(prompt, model, temperature, max_output_tokens, conversation_id)` - Send a prompt to Gemini and get a response
- `ask_gemini_with_web_search(prompt, model, temperature, max_output_tokens, conversation_id)` - Send a prompt to Gemini with enhanced search instructions
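
These parameters map closely onto Google's official `@google/generative-ai` SDK. The sketch below only illustrates that mapping under the assumption that the server uses this SDK; it is not the repository's actual handler, and it omits `conversation_id`, which is covered under How It Works.

```typescript
import { GoogleGenerativeAI } from "@google/generative-ai";

// Illustrative only: a stripped-down ask_gemini, assuming the official SDK.
const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY ?? "");

async function askGemini(
  prompt: string,
  model = "gemini-1.5-pro",
  temperature = 0.7,
  maxOutputTokens = 2048
): Promise<string> {
  const gemini = genAI.getGenerativeModel({
    model,
    generationConfig: { temperature, maxOutputTokens },
  });
  const result = await gemini.generateContent(prompt);
  return result.response.text();
}
```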

## Example Usage

Basic Gemini usage: tell Claude to ask Gemini a question!

Conversations: tell Claude to have a back-and-forth conversation with Gemini, for example by asking a question and then following up on Gemini's answer. Note how the conversation_id allows maintaining conversation history for context-aware responses across multiple interactions.

With search capability: for questions that may benefit from comprehensive, up-to-date information, such as research or planning, try the search-enhanced tool instead.

## How It Works

This tool utilizes Google's Gemini API with conversation state management (sketched in the example after this list). This approach:

- Maintains conversation history locally for context tracking
- Provides access to various Gemini models (pro, flash, vision, etc.)
- Improves the user experience by maintaining context across messages
- Allows enhanced responses with search-oriented prompting
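
The repository's actual code is not reproduced here, but the two ideas above, local history keyed by `conversation_id` and search-oriented prompting, can be sketched roughly as follows, assuming the official `@google/generative-ai` SDK and a simple in-memory store. All names in the sketch are illustrative assumptions.

```typescript
import { GoogleGenerativeAI, type Content } from "@google/generative-ai";

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY ?? "");

// Hypothetical in-memory history store, keyed by conversation_id.
const conversations = new Map<string, Content[]>();

async function askGeminiWithHistory(
  prompt: string,
  conversationId: string,
  model = "gemini-1.5-flash"
): Promise<string> {
  const history = conversations.get(conversationId) ?? [];

  // Seed a chat session with the stored history, then send the new prompt.
  const chat = genAI.getGenerativeModel({ model }).startChat({ history });
  const result = await chat.sendMessage(prompt);
  const reply = result.response.text();

  // Persist both turns so the next call in this conversation has context.
  conversations.set(conversationId, [
    ...history,
    { role: "user", parts: [{ text: prompt }] },
    { role: "model", parts: [{ text: reply }] },
  ]);

  return reply;
}

// "Enhanced search" here is prompt-level: prepend instructions that push the
// model toward comprehensive, well-sourced answers before calling the same API.
function withSearchInstructions(prompt: string): string {
  return (
    "Answer using the most comprehensive and up-to-date information you can, " +
    "and mention sources where possible.\n\n" + prompt
  );
}
```

A real implementation could persist history differently; the sketch just shows how a conversation_id ties multiple tool calls to one shared context.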

## License

MIT License