Gemini MCP Server

by lbds137

A Model Context Protocol (MCP) server that enables Claude to collaborate with Google's Gemini AI models.

Features

  • 🤖 Multiple Gemini Tools: Ask questions, review code, brainstorm ideas, generate tests, and get explanations

  • 🔄 Dual-Model Support: Automatic fallback from an experimental primary model to a stable fallback model

  • ⚡ Configurable Models: Easy switching between different Gemini variants

  • 🛡️ Reliable: Automatic fallback keeps the tools working even when the primary model fails

  • 📊 Transparent: Shows which model was used for each response
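
The fallback behavior can be pictured as a thin wrapper that tries the primary model first and switches to the fallback on any error, recording which model answered. This is a hypothetical sketch, not the actual `DualModelManager` in `src/gemini_mcp/server.py`:

```python
# Sketch of primary -> fallback dispatch (hypothetical; the real
# DualModelManager may be structured differently).
class DualModelManager:
    def __init__(self, primary, fallback):
        # primary / fallback are callables: prompt -> response text
        self.primary = primary
        self.fallback = fallback
        self.last_model_used = None  # surfaced so responses can report it

    def generate(self, prompt):
        try:
            result = self.primary(prompt)
            self.last_model_used = "primary"
        except Exception:
            # Any failure (quota, region, deprecation) falls through here.
            result = self.fallback(prompt)
            self.last_model_used = "fallback"
        return result
```

Because `last_model_used` is recorded on every call, each response can be annotated with the model that produced it.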

Quick Start

1. Prerequisites

  • Python 3 and pip

  • A Google Gemini API key

  • Claude Code (for the claude mcp registration step below)

2. Installation

```bash
# Clone the repository
git clone https://github.com/lbds137/gemini-mcp-server.git
cd gemini-mcp-server

# Install dependencies
pip install -r requirements.txt

# Copy and configure environment
cp .env.example .env
# Edit .env and add your GEMINI_API_KEY
```

3. Configuration

Edit .env to configure your models:

```bash
# Your Gemini API key (required)
GEMINI_API_KEY=your_api_key_here

# Model configuration (optional - defaults shown)
GEMINI_MODEL_PRIMARY=gemini-2.5-pro-preview-06-05
GEMINI_MODEL_FALLBACK=gemini-1.5-pro
GEMINI_MODEL_TIMEOUT=10000
```
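
Loading these variables with the documented defaults might look like the following sketch (not the server's actual loader; the names and defaults come from the example above, and the timeout is assumed to be in milliseconds):

```python
import os

# Defaults mirror the values documented in .env.example.
DEFAULTS = {
    "GEMINI_MODEL_PRIMARY": "gemini-2.5-pro-preview-06-05",
    "GEMINI_MODEL_FALLBACK": "gemini-1.5-pro",
    "GEMINI_MODEL_TIMEOUT": "10000",  # assumed milliseconds
}

def load_config(env=os.environ):
    """Read model settings, falling back to documented defaults."""
    if not env.get("GEMINI_API_KEY"):
        raise ValueError("GEMINI_API_KEY is required")
    cfg = {key: env.get(key, default) for key, default in DEFAULTS.items()}
    cfg["GEMINI_MODEL_TIMEOUT"] = int(cfg["GEMINI_MODEL_TIMEOUT"])
    cfg["GEMINI_API_KEY"] = env["GEMINI_API_KEY"]
    return cfg
```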

4. Development Setup

For development with PyCharm or other IDEs:

```bash
# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install in development mode
pip install -e .

# Run tests
python -m pytest
```

5. Register with Claude

```bash
# Install to MCP location
./scripts/install.sh

# Or manually register
claude mcp add gemini-collab python3 ~/.claude-mcp-servers/gemini-collab/server.py
```

Available Tools

ask_gemini

General questions and problem-solving assistance.

gemini_code_review

Get code review feedback focusing on security, performance, and best practices.

gemini_brainstorm

Collaborative brainstorming for architecture and design decisions.

gemini_test_cases

Generate comprehensive test scenarios for your code.

gemini_explain

Get clear explanations of complex code or concepts.

server_info

Check server status and model configuration.
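
Conceptually, each of the tools above wraps the same Gemini call in a different prompt template. The templates below are hypothetical placeholders to illustrate the dispatch, not the server's actual wording:

```python
# Hypothetical prompt templates, one per tool; the real wording in
# server.py will differ.
TOOL_PROMPTS = {
    "ask_gemini": "{input}",
    "gemini_code_review": (
        "Review this code for security, performance, and best practices:\n{input}"
    ),
    "gemini_brainstorm": "Brainstorm architecture and design ideas for:\n{input}",
    "gemini_test_cases": "Generate comprehensive test scenarios for:\n{input}",
    "gemini_explain": "Explain clearly:\n{input}",
}

def build_prompt(tool, user_input):
    """Map a tool name to the prompt sent to Gemini."""
    if tool not in TOOL_PROMPTS:
        raise KeyError(f"unknown tool: {tool}")
    return TOOL_PROMPTS[tool].format(input=user_input)
```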

Model Configurations

Best Quality (Default)

```bash
GEMINI_MODEL_PRIMARY=gemini-2.5-pro-preview-06-05
GEMINI_MODEL_FALLBACK=gemini-1.5-pro
```

Best Performance

```bash
GEMINI_MODEL_PRIMARY=gemini-2.5-flash-preview-05-20
GEMINI_MODEL_FALLBACK=gemini-2.0-flash
```

Most Cost-Effective

```bash
GEMINI_MODEL_PRIMARY=gemini-2.0-flash
GEMINI_MODEL_FALLBACK=gemini-2.0-flash-lite
```

Development

Project Structure

```
gemini-mcp-server/
├── src/
│   └── gemini_mcp/
│       ├── __init__.py
│       └── server.py              # Main server with DualModelManager
├── tests/
│   └── test_server.py
├── scripts/
│   ├── install.sh                 # Install/update deployment script
│   └── dev-link.sh                # Development symlink script
├── docs/
│   ├── BUILD_YOUR_OWN_MCP_SERVER.md
│   ├── DUAL_MODEL_CONFIGURATION.md  # Dual-model setup guide
│   ├── PYCHARM_SETUP.md
│   └── TESTING.md
├── .claude/
│   └── settings.json              # Claude Code permissions
├── .env                           # Your configuration (git-ignored)
├── .env.example                   # Example configuration
├── .gitignore
├── CLAUDE.md                      # Instructions for Claude Code
├── LICENSE
├── README.md                      # This file
├── requirements.txt
├── setup.py
├── package.json                   # MCP registration metadata
└── package-lock.json
```

Running Tests

```bash
python -m pytest tests/ -v
```

Contributing

  1. Fork the repository

  2. Create a feature branch (git checkout -b feature/amazing-feature)

  3. Commit your changes (git commit -m 'Add amazing feature')

  4. Push to the branch (git push origin feature/amazing-feature)

  5. Open a Pull Request

Updating

To update your local MCP installation after making changes:

```bash
./scripts/install.sh
```

This script intelligently handles both installation and updates.

Troubleshooting

Server not found

```bash
# Check registration
claude mcp list

# Re-register if needed
./scripts/install.sh
```

API Key Issues

```bash
# Verify environment variable
echo $GEMINI_API_KEY

# Test directly
python -c "import google.generativeai as genai; genai.configure(api_key='$GEMINI_API_KEY'); print('✅ API key works')"
```

Model Availability

Some models may not be available in all regions. If the primary model fails consistently, check the server logs to confirm that requests are being served by the fallback model.
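
With the google-generativeai package, `genai.list_models()` can enumerate the models reachable from your key and region; model selection then reduces to picking the first configured model that is actually available. A minimal sketch of that selection step (`pick_model` and the example names are hypothetical):

```python
def pick_model(preferred, available):
    """Return the first model from the configured order that is reachable.

    preferred: ordered list, e.g. [primary, fallback]
    available: set of model names reachable from your key/region
    """
    for name in preferred:
        if name in available:
            return name
    raise RuntimeError("no configured model is available in this region")
```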

License

MIT License - see LICENSE file for details.
