by lbds137

Gemini MCP Server

A Model Context Protocol (MCP) server that enables Claude to collaborate with Google's Gemini AI models.

Features

  • πŸ€– Multiple Gemini Tools: Ask questions, review code, brainstorm ideas, generate tests, and get explanations

  • πŸ”„ Dual-Model Support: Automatic fallback from experimental to stable models

  • ⚑ Configurable Models: Easy switching between different Gemini variants

  • πŸ›‘οΈ Reliable: Never lose functionality with automatic model fallback

  • πŸ“Š Transparent: Shows which model was used for each response
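The fallback behavior described above can be sketched as follows. `DualModelManager` matches the class name mentioned in the project structure, but the injected `call_model` callable and the exact logic here are illustrative assumptions, not the server's actual code:

```python
import os


class DualModelManager:
    """Sketch of automatic fallback: try the primary model, then the stable fallback."""

    def __init__(self, call_model, primary=None, fallback=None):
        # call_model(model_name, prompt) -> str; raises on failure (hypothetical interface)
        self.call_model = call_model
        self.primary = primary or os.getenv("GEMINI_MODEL_PRIMARY", "gemini-2.5-pro-preview-06-05")
        self.fallback = fallback or os.getenv("GEMINI_MODEL_FALLBACK", "gemini-1.5-pro")

    def generate(self, prompt):
        for name in (self.primary, self.fallback):
            try:
                # Return the model name too, so callers can report which model answered
                return self.call_model(name, prompt), name
            except Exception:
                continue
        raise RuntimeError("Both primary and fallback models failed")
```

Returning the model name alongside the response is what makes the "Transparent" feature possible: every answer can state which model produced it.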


Quick Start

1. Prerequisites

A Python 3 installation with pip, and a Google Gemini API key.

2. Installation

```bash
# Clone the repository
git clone https://github.com/lbds137/gemini-mcp-server.git
cd gemini-mcp-server

# Install dependencies
pip install -r requirements.txt

# Copy and configure environment
cp .env.example .env
# Edit .env and add your GEMINI_API_KEY
```

3. Configuration

Edit .env to configure your models:

```
# Your Gemini API key (required)
GEMINI_API_KEY=your_api_key_here

# Model configuration (optional - defaults shown)
GEMINI_MODEL_PRIMARY=gemini-2.5-pro-preview-06-05
GEMINI_MODEL_FALLBACK=gemini-1.5-pro
GEMINI_MODEL_TIMEOUT=10000
```
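The server reads these settings from the environment at startup. A minimal sketch of that loading step, assuming the timeout value is in milliseconds and using a hypothetical helper name (`load_model_config` is not necessarily in the actual source):

```python
import os


def load_model_config():
    """Read model settings from the environment, falling back to the documented defaults."""
    return {
        "primary": os.getenv("GEMINI_MODEL_PRIMARY", "gemini-2.5-pro-preview-06-05"),
        "fallback": os.getenv("GEMINI_MODEL_FALLBACK", "gemini-1.5-pro"),
        # GEMINI_MODEL_TIMEOUT is assumed to be milliseconds; convert to seconds
        "timeout_s": int(os.getenv("GEMINI_MODEL_TIMEOUT", "10000")) / 1000,
    }
```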

4. Development Setup

For development with PyCharm or other IDEs:

```bash
# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install in development mode
pip install -e .

# Run tests
python -m pytest
```

5. Register with Claude

```bash
# Install to MCP location
./scripts/install.sh

# Or manually register
claude mcp add gemini-collab python3 ~/.claude-mcp-servers/gemini-collab/server.py
```

Available Tools

ask_gemini

General questions and problem-solving assistance.

gemini_code_review

Get code review feedback focusing on security, performance, and best practices.

gemini_brainstorm

Collaborative brainstorming for architecture and design decisions.

gemini_test_cases

Generate comprehensive test scenarios for your code.

gemini_explain

Get clear explanations of complex code or concepts.

server_info

Check server status and model configuration.
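Each of the tools above ultimately wraps the user's input in a task-specific prompt before sending it to Gemini. A minimal sketch of that dispatch; the template strings and `build_prompt` helper are assumptions for illustration, not the server's actual prompts:

```python
# Hypothetical prompt templates, one per tool name
TOOL_PROMPTS = {
    "ask_gemini": "{input}",
    "gemini_code_review": "Review this code for security, performance, and best practices:\n{input}",
    "gemini_brainstorm": "Brainstorm architecture and design ideas for:\n{input}",
    "gemini_test_cases": "Generate comprehensive test scenarios for:\n{input}",
    "gemini_explain": "Explain this code or concept clearly:\n{input}",
}


def build_prompt(tool: str, user_input: str) -> str:
    """Map an MCP tool name to the prompt sent to Gemini."""
    if tool not in TOOL_PROMPTS:
        raise ValueError(f"Unknown tool: {tool}")
    return TOOL_PROMPTS[tool].format(input=user_input)
```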

Model Configurations

Best Quality (Default)

```
GEMINI_MODEL_PRIMARY=gemini-2.5-pro-preview-06-05
GEMINI_MODEL_FALLBACK=gemini-1.5-pro
```

Best Performance

```
GEMINI_MODEL_PRIMARY=gemini-2.5-flash-preview-05-20
GEMINI_MODEL_FALLBACK=gemini-2.0-flash
```

Most Cost-Effective

```
GEMINI_MODEL_PRIMARY=gemini-2.0-flash
GEMINI_MODEL_FALLBACK=gemini-2.0-flash-lite
```

Development

Project Structure

```
gemini-mcp-server/
β”œβ”€β”€ src/
β”‚   └── gemini_mcp/
β”‚       β”œβ”€β”€ __init__.py
β”‚       └── server.py            # Main server with DualModelManager
β”œβ”€β”€ tests/
β”‚   └── test_server.py
β”œβ”€β”€ scripts/
β”‚   β”œβ”€β”€ install.sh               # Install/update deployment script
β”‚   └── dev-link.sh              # Development symlink script
β”œβ”€β”€ docs/
β”‚   β”œβ”€β”€ BUILD_YOUR_OWN_MCP_SERVER.md
β”‚   β”œβ”€β”€ DUAL_MODEL_CONFIGURATION.md  # Dual-model setup guide
β”‚   β”œβ”€β”€ PYCHARM_SETUP.md
β”‚   └── TESTING.md
β”œβ”€β”€ .claude/
β”‚   └── settings.json            # Claude Code permissions
β”œβ”€β”€ .env                         # Your configuration (git-ignored)
β”œβ”€β”€ .env.example                 # Example configuration
β”œβ”€β”€ .gitignore
β”œβ”€β”€ CLAUDE.md                    # Instructions for Claude Code
β”œβ”€β”€ LICENSE
β”œβ”€β”€ README.md                    # This file
β”œβ”€β”€ requirements.txt
β”œβ”€β”€ setup.py
β”œβ”€β”€ package.json                 # MCP registration metadata
└── package-lock.json
```

Running Tests

```bash
python -m pytest tests/ -v
```

Contributing

  1. Fork the repository

  2. Create a feature branch (git checkout -b feature/amazing-feature)

  3. Commit your changes (git commit -m 'Add amazing feature')

  4. Push to the branch (git push origin feature/amazing-feature)

  5. Open a Pull Request

Updating

To update your local MCP installation after making changes:

```bash
./scripts/install.sh
```

This script intelligently handles both installation and updates.

Troubleshooting

Server not found

```bash
# Check registration
claude mcp list

# Re-register if needed
./scripts/install.sh
```

API Key Issues

```bash
# Verify environment variable
echo $GEMINI_API_KEY

# Test directly
python -c "import google.generativeai as genai; genai.configure(api_key='$GEMINI_API_KEY'); print('βœ… API key works')"
```

Model Availability

Some models may not be available in all regions. If the primary model fails consistently, check the logs to confirm that requests are being served by the fallback model.
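To see which models your API key can actually reach, you can filter the listing returned by the `google-generativeai` library. The helper name here is an assumption; `genai.list_models()` and the `supported_generation_methods` field are part of the library's API:

```python
def models_supporting_generate(models):
    """Return names of models that can serve generateContent requests.

    Pass the result of google.generativeai's list_models(), e.g.:
        import google.generativeai as genai
        genai.configure(api_key="...")
        print(models_supporting_generate(genai.list_models()))
    """
    return [m.name for m in models if "generateContent" in m.supported_generation_methods]
```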

License

MIT License - see LICENSE file for details.
