GemForge-Gemini-Tools-MCP

Provides specialized tools for interacting with Google's Gemini AI models, featuring intelligent model selection based on task type, advanced file handling capabilities, and optimized prompts for different use cases such as search, reasoning, code analysis, and file operations.

GemForge (Gemini Tools)

GemForge-Gemini-Tools-MCP: Enterprise-grade Gemini integration for your favorite MCP agents. Supercharge Claude, Roo Code, and Windsurf with codebase analysis, live search, text/PDF/image processing, and more.

Quick Navigation

  • Why GemForge?
  • Quick Start
  • Heavy-Duty Reliability
  • Key Tools
  • Configuration
  • Deployment
  • What Sets GemForge Apart?
  • Community & Support
  • License
  • Acknowledgments

Why GemForge?

GemForge is the essential bridge between Google's Gemini AI and the MCP ecosystem:

  • Real-Time Web Access: Fetch breaking news, market trends, and current data with gemini_search
  • Advanced Reasoning: Process complex logic problems with step-by-step thinking via gemini_reason
  • Code Mastery: Analyze full repositories, generate solutions, and debug code with gemini_code
  • Multi-File Processing: Handle 60+ file formats including PDFs, images, and more with gemini_fileops
  • Intelligent Model Selection: Automatically routes to optimal Gemini model for each task
  • Enterprise-Ready: Robust error handling, rate limit management, and API fallback mechanisms

Quick Start

One-Line Install

```bash
npx @gemforge/mcp-server@latest init
```

Manual Setup

  1. Create a configuration file (claude_desktop_config.json):

```json
{
  "mcpServers": {
    "GemForge": {
      "command": "node",
      "args": ["./dist/index.js"],
      "env": {
        "GEMINI_API_KEY": "your_api_key_here"
      }
    }
  }
}
```

  2. Install and run:

```bash
npm install gemforge-mcp
npm start
```

Watch 30-second setup demo →

Heavy-Duty Reliability

GemForge is built for production environments:

  • Support for 60+ File Types: Process everything from code to documents to images
  • Automatic Model Fallbacks: Continues functioning even during rate limits or service disruptions
  • Enterprise-Grade Error Logging: Detailed diagnostics for troubleshooting
  • API Resilience: Exponential backoff, retry logic, and seamless model switching
  • Full Repository Support: Analyze entire codebases with configurable inclusion/exclusion patterns (see the example after this list)
  • XML Content Processing: Specialized handling for structured data
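
As an illustration of those inclusion/exclusion patterns, gemini_code accepts repomix-style flags through its repomix_options parameter, the same mechanism used in the gemini_code example under Key Tools below. The call here is a sketch: --include and --no-gitignore appear in that example, while the --ignore pattern is an assumption added for illustration.

```jsonc
// Sketch of a scoped repository analysis; "--ignore" is assumed, not documented in this README
{
  "toolName": "gemini_code",
  "toolParams": {
    "question": "Review error handling across the service layer",
    "directory_path": "path/to/project",
    "repomix_options": "--include \"src/**/*.ts\" --ignore \"**/*.test.ts\" --no-gitignore"
  }
}
```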

Key Tools

| Tool | Description | Key Capability |
| --- | --- | --- |
| gemini_search | Web-connected information retrieval | Real-time data access |
| gemini_reason | Complex problem solving with step-by-step logic | Transparent reasoning process |
| gemini_code | Deep code understanding and generation | Full repository analysis |
| gemini_fileops | Multi-file processing across 60+ formats | Document comparison and transformation |
{ "toolName": "gemini_search", "toolParams": { "query": "Latest advancements in quantum computing", "enable_thinking": true } }
{ "toolName": "gemini_code", "toolParams": { "question": "Identify improvements and new features", "directory_path": "path/to/project", "repomix_options": "--include \"**/*.js\" --no-gitignore" } }
{ "toolName": "gemini_fileops", "toolParams": { "file_path": ["contract_v1.pdf", "contract_v2.pdf"], "operation": "analyze", "instruction": "Compare these contract versions and extract all significant changes." } }

Configuration

GemForge offers flexible configuration options:

Environment variables:

```bash
GEMINI_API_KEY=your_api_key_here   # Required: Gemini API key
GEMINI_PAID_TIER=true              # Optional: set to true if using the paid tier (better rate limits)
DEFAULT_MODEL_ID=gemini-2.5-pro    # Optional: override the default model selection
LOG_LEVEL=info                     # Optional: logging verbosity (debug, info, warn, error)
```

MCP client configuration (claude_desktop_config.json):

```json
{
  "mcpServers": {
    "GemForge": {
      "command": "node",
      "args": ["./dist/index.js"],
      "env": {
        "GEMINI_API_KEY": "your_api_key_here"
      }
    }
  }
}
```

GemForge intelligently selects the best model for each task:

  • gemini_search: Uses gemini-2.5-flash for speed and search integration
  • gemini_reason: Uses gemini-2.5-pro for deep reasoning capabilities
  • gemini_code: Uses gemini-2.5-pro for complex code understanding
  • gemini_fileops: Selects between gemini-2.0-flash-lite and gemini-1.5-pro based on file size

Override the default with the model_id parameter in any tool call, or set the DEFAULT_MODEL_ID environment variable.
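
For example, to push a search call onto the pro model instead of the default gemini-2.5-flash, add model_id to the call (a sketch using only the parameters documented above):

```json
{
  "toolName": "gemini_search",
  "toolParams": {
    "query": "Latest advancements in quantum computing",
    "model_id": "gemini-2.5-pro"
  }
}
```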

Deployment

Smithery.ai

One-click deployment via Smithery.ai

Docker

```bash
docker run -e GEMINI_API_KEY=your_api_key ghcr.io/pv-bhat/gemforge:latest
```
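
Optional settings can be passed the same way; the flags below are illustrative and assume the image reads the same environment variables documented under Configuration.

```bash
docker run \
  -e GEMINI_API_KEY=your_api_key \
  -e GEMINI_PAID_TIER=true \
  -e DEFAULT_MODEL_ID=gemini-2.5-pro \
  -e LOG_LEVEL=debug \
  ghcr.io/pv-bhat/gemforge:latest
```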

Self-Hosted

Use our MCP.so Directory listing for integration instructions.

What Sets GemForge Apart?

  • Cross-Ecosystem Power: Bridge Google's AI with Claude and other MCP agents
  • Multi-File Analysis: Compare documents, images, or code versions
  • Smart Routing: Automatic model selection based on task requirements
  • Production-Ready: Built for enterprise environments

Community & Support

Documentation

Visit our Documentation Site for:

  • Advanced usage tutorials
  • API reference
  • Troubleshooting tips

License

Licensed under the MIT License. See LICENSE for details.

Acknowledgments

Powered by the Gemini API and inspired by the Model Context Protocol.
