Gemini MCP Server

A Model Context Protocol (MCP) server that enables Claude to collaborate with Google's Gemini AI models, providing tools for question answering, code review, brainstorming, test generation, and explanations.

Features

  • 🤖 Multiple Gemini Tools: Ask questions, review code, brainstorm ideas, generate tests, and get explanations
  • 🔄 Dual-Model Support: Automatic fallback from experimental to stable models
  • Configurable Models: Easy switching between different Gemini variants
  • 🛡️ Reliable: Automatic model fallback keeps the tools working when the primary model is unavailable (see the sketch after this list)
  • 📊 Transparent: Shows which model was used for each response
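
The fallback behavior can be pictured roughly as follows. This is a minimal sketch, not the server's actual DualModelManager code; the class and method names are illustrative.

# Minimal sketch of the dual-model fallback (illustrative only; not the
# actual DualModelManager implementation).
import google.generativeai as genai

class DualModel:
    """Try the primary Gemini model first; fall back to the stable one on error."""

    def __init__(self, primary: str, fallback: str):
        self.primary = primary
        self.fallback = fallback

    def generate(self, prompt: str) -> tuple[str, str]:
        for name in (self.primary, self.fallback):
            try:
                response = genai.GenerativeModel(name).generate_content(prompt)
                return response.text, name  # report which model answered
            except Exception:
                continue  # primary unavailable or errored; try the fallback
        raise RuntimeError("Both primary and fallback models failed")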

Quick Start

1. Prerequisites

  • Python 3 and pip
  • git (to clone the repository)
  • A Google Gemini API key
  • The Claude Code CLI (claude), for registering the server

2. Installation

# Clone the repository
git clone https://github.com/lbds137/gemini-mcp-server.git
cd gemini-mcp-server

# Install dependencies
pip install -r requirements.txt

# Copy and configure environment
cp .env.example .env
# Edit .env and add your GEMINI_API_KEY

3. Configuration

Edit .env to configure your models:

# Your Gemini API key (required)
GEMINI_API_KEY=your_api_key_here

# Model configuration (optional - defaults shown)
GEMINI_MODEL_PRIMARY=gemini-2.5-pro-preview-06-05
GEMINI_MODEL_FALLBACK=gemini-1.5-pro
GEMINI_MODEL_TIMEOUT=10000
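
A minimal sketch of how these values could be read at startup, assuming python-dotenv is used to load the file; the variable names match .env above, but the loading code itself is illustrative rather than the server's exact implementation:

# Illustrative startup/config loading; the server's actual code may differ.
import os
from dotenv import load_dotenv  # python-dotenv (assumed dependency)
import google.generativeai as genai

load_dotenv()  # pull the GEMINI_* variables from .env into the environment

genai.configure(api_key=os.environ["GEMINI_API_KEY"])            # required
primary = os.getenv("GEMINI_MODEL_PRIMARY", "gemini-2.5-pro-preview-06-05")
fallback = os.getenv("GEMINI_MODEL_FALLBACK", "gemini-1.5-pro")
timeout_ms = int(os.getenv("GEMINI_MODEL_TIMEOUT", "10000"))     # milliseconds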

4. Development Setup

For development with PyCharm or other IDEs:

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install in development mode
pip install -e .

# Run tests
python -m pytest

5. Register with Claude

# Install to MCP location
./scripts/install.sh

# Or manually register
claude mcp add gemini-collab python3 ~/.claude-mcp-servers/gemini-collab/server.py

Available Tools

ask_gemini

General questions and problem-solving assistance.

gemini_code_review

Get code review feedback focusing on security, performance, and best practices.

gemini_brainstorm

Collaborative brainstorming for architecture and design decisions.

gemini_test_cases

Generate comprehensive test scenarios for your code.

gemini_explain

Get clear explanations of complex code or concepts.

server_info

Check server status and model configuration.
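
Each tool is invoked by name with a JSON arguments object, as with any MCP server. Below is a minimal client-side sketch using the mcp Python SDK; the argument name question is an assumption, so check the schemas returned by list_tools() for the real parameters.

# Illustrative MCP client call (requires the "mcp" package). Tool argument
# names such as "question" are assumptions, not the server's documented schema.
import asyncio
import os
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    params = StdioServerParameters(
        command="python3",
        args=[os.path.expanduser("~/.claude-mcp-servers/gemini-collab/server.py")],
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()  # lists ask_gemini, gemini_code_review, ...
            print([t.name for t in tools.tools])
            result = await session.call_tool("ask_gemini", {"question": "What is MCP?"})
            print(result)

asyncio.run(main())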

Model Configurations

Best Quality (Default)

GEMINI_MODEL_PRIMARY=gemini-2.5-pro-preview-06-05
GEMINI_MODEL_FALLBACK=gemini-1.5-pro

Best Performance

GEMINI_MODEL_PRIMARY=gemini-2.5-flash-preview-05-20
GEMINI_MODEL_FALLBACK=gemini-2.0-flash

Most Cost-Effective

GEMINI_MODEL_PRIMARY=gemini-2.0-flash
GEMINI_MODEL_FALLBACK=gemini-2.0-flash-lite

Development

Project Structure

gemini-mcp-server/
├── src/
│   └── gemini_mcp/
│       ├── __init__.py
│       └── server.py                    # Main server with DualModelManager
├── tests/
│   └── test_server.py
├── scripts/
│   ├── install.sh                       # Quick installation script
│   ├── update.sh                        # Update deployment script
│   └── dev-link.sh                      # Development symlink script
├── docs/
│   ├── BUILD_YOUR_OWN_MCP_SERVER.md
│   ├── DUAL_MODEL_CONFIGURATION.md      # Dual-model setup guide
│   ├── PYCHARM_SETUP.md
│   └── TESTING.md
├── .claude/
│   └── settings.json                    # Claude Code permissions
├── .env                                 # Your configuration (git-ignored)
├── .env.example                         # Example configuration
├── .gitignore
├── CLAUDE.md                            # Instructions for Claude Code
├── LICENSE
├── README.md                            # This file
├── requirements.txt
├── setup.py
├── package.json                         # MCP registration metadata
└── package-lock.json

Running Tests

python -m pytest tests/ -v

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Updating

To update your local MCP installation after making changes:

./scripts/update.sh

This will copy the latest version to your MCP servers directory.

Troubleshooting

Server not found

# Check registration
claude mcp list

# Re-register if needed
./scripts/install.sh

API Key Issues

# Verify environment variable
echo $GEMINI_API_KEY

# Test directly
python -c "import google.generativeai as genai; genai.configure(api_key='$GEMINI_API_KEY'); print('✅ API key works')"

Model Availability

Some models may not be available in all regions. If the primary model fails consistently, check the server logs to confirm that the fallback model is being used.
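
A quick way to see which models your API key and region can actually reach is the model-listing call in the google-generativeai library (a diagnostic sketch, separate from the server itself):

# List the models available to your key/region, to pick valid values for
# GEMINI_MODEL_PRIMARY and GEMINI_MODEL_FALLBACK.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])
for m in genai.list_models():
    if "generateContent" in m.supported_generation_methods:
        print(m.name)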

License

MIT License - see LICENSE file for details.

Acknowledgments
