Deep Code Reasoning MCP Server
An MCP server that pairs Claude Code with Google's Gemini AI for complementary code analysis. This server enables a multi-model workflow where Claude Code handles tight terminal integration and multi-file refactoring, while Gemini leverages its massive context window (1M tokens) and code execution capabilities for distributed system debugging and long-trace analysis.
Core Value
Both Claude and Gemini can handle deep semantic reasoning and distributed system bugs. This server enables an intelligent routing strategy where:
- Claude Code excels at local-context operations, incremental patches, and CLI-native workflows
- Gemini 2.5 Pro shines with huge-context sweeps, synthetic test execution, and analyzing failures that span logs + traces + code
The "escalation" model treats LLMs like heterogeneous microservices - route to the one that's most capable for each sub-task.
Features
- Gemini 2.5 Pro Preview: Uses Google's latest Gemini 2.5 Pro Preview (05-06) model with a 1M-token context window
- Conversational Analysis: NEW! AI-to-AI dialogues between Claude and Gemini for iterative problem-solving
- Execution Flow Tracing: Understands data flow and state transformations, not just function calls
- Cross-System Impact Analysis: Models how changes propagate across service boundaries
- Performance Modeling: Identifies N+1 patterns, memory leaks, and algorithmic bottlenecks
- Hypothesis Testing: Tests theories about code behavior with evidence-based validation
- Long Context Support: Leverages Gemini 2.5 Pro Preview's 1M token context for analyzing large codebases
Prerequisites
- Node.js 18 or later
- A Google Cloud account with Gemini API access
- Gemini API key from Google AI Studio
Key Dependencies
- @google/generative-ai: Google's official SDK for Gemini API integration
- @modelcontextprotocol/sdk: MCP protocol implementation for Claude integration
- zod: Runtime type validation for tool parameters
- dotenv: Environment variable management
Installation
- Clone the repository
- Install dependencies
- Set up your Gemini API key
- Build the project
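A sketch of those steps (the repository URL, directory name, and npm script names are assumptions based on a standard TypeScript project layout):
```bash
# Clone the repository (substitute the actual repository URL)
git clone <repository-url>
cd deep-code-reasoning-mcp

# Install dependencies
npm install

# Set up your Gemini API key in a .env file at the project root
echo "GEMINI_API_KEY=your-gemini-api-key-here" > .env

# Build the project (assumes a standard "build" npm script)
npm run build
```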
Configuration
Environment Variables
GEMINI_API_KEY (required): Your Google Gemini API key
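A minimal .env file in the project root looks like:
```
GEMINI_API_KEY=your-gemini-api-key-here
```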
Claude Desktop Configuration
Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json):
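A sketch of the entry to add (the server name and the path to the built entry point are placeholders; adjust them to your install location):
```json
{
  "mcpServers": {
    "deep-code-reasoning": {
      "command": "node",
      "args": ["/absolute/path/to/deep-code-reasoning-mcp/dist/index.js"],
      "env": {
        "GEMINI_API_KEY": "your-gemini-api-key-here"
      }
    }
  }
}
```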
How It Works
- Claude Code performs initial analysis using its strengths in multi-file refactoring and test-driven loops
- When beneficial, Claude escalates to this MCP server - particularly for:
  - Analyzing gigantic log/trace dumps that exceed Claude's context
  - Running iterative hypothesis testing with code execution
  - Correlating failures across many microservices
- Server prepares comprehensive context including code, logs, and traces
- Gemini analyzes with its 1M-token context and visible "thinking" traces
- Results returned to Claude Code for implementation of fixes
Available Tools
Note: The tool parameters use snake_case naming convention and are validated using Zod schemas. The actual implementation provides more detailed type safety than shown in these simplified examples. Full TypeScript type definitions are available in src/models/types.ts.
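For illustration, a schema in this spirit might look like the sketch below; only claude_context is a documented parameter name, the other fields are placeholders, and the authoritative definitions live in src/models/types.ts:
```typescript
import { z } from 'zod';

// Simplified, hypothetical sketch of a tool parameter schema.
// Only claude_context is a documented parameter name; the rest are placeholders.
const EscalateAnalysisSchema = z.object({
  claude_context: z.object({
    attempted_approaches: z.array(z.string()), // placeholder field
    partial_findings: z.array(z.string()),     // placeholder field
  }),
  analysis_type: z.enum(['execution_trace', 'cross_system', 'performance', 'hypothesis_test']), // placeholder
});

// Parsed parameters get full TypeScript types via z.infer.
type EscalateAnalysisParams = z.infer<typeof EscalateAnalysisSchema>;
```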
Conversational Analysis Tools
The server now includes AI-to-AI conversational tools that enable Claude and Gemini to engage in multi-turn dialogues for complex analysis:
start_conversation
Initiates a conversational analysis session between Claude and Gemini.
continue_conversation
Continues an active conversation with Claude's response or follow-up question.
finalize_conversation
Completes the conversation and generates structured analysis results.
get_conversation_status
Checks the status and progress of an ongoing conversation.
Traditional Analysis Tools
escalate_analysis
Main tool for handing off complex analysis from Claude Code to Gemini.
trace_execution_path
Deep execution analysis with Gemini's semantic understanding.
cross_system_impact
Analyze impacts across service boundaries.
performance_bottleneck
Deep performance analysis beyond simple profiling.
hypothesis_test
Test specific theories about code behavior.
Example Use Cases
Conversational Analysis Example
When Claude needs deep iterative analysis with Gemini:
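A sketch of that flow, with hypothetical argument names (the real schemas are defined in src/models/types.ts):
```typescript
// start_conversation: hand Gemini the problem plus what Claude has already tried.
const startArgs = {
  claude_context: {
    attempted_approaches: ['added retries', 'checked connection pool limits'], // placeholder fields
    partial_findings: ['timeouts correlate with cache misses'],
  },
  initial_question: 'Why do checkout requests time out only under load?', // placeholder field name
};

// continue_conversation: iterate with follow-up evidence using the returned session id.
const continueArgs = {
  session_id: '<id-from-start_conversation>', // placeholder field name
  message: 'Cold-cache requests hit the database roughly 40x per page load.',
};

// finalize_conversation: close the session and get structured findings back for Claude.
const finalizeArgs = { session_id: '<id-from-start_conversation>' };
```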
Case 1: Distributed Trace Analysis
When a failure signature spans multiple services with GB of logs:
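For example, Claude might escalate with something along these lines (using escalate_analysis; the argument names are placeholders):
```typescript
// Point Gemini at the services and trace dumps involved in the failure signature.
const escalateArgs = {
  analysis_type: 'cross_system',                                      // placeholder value
  code_scope: ['services/checkout', 'services/payments'],             // placeholder field
  trace_files: ['/var/log/traces/checkout-incident-2024-05-06.json'], // placeholder field
  claude_context: {
    partial_findings: ['~5% of requests fail after the payment-service hop'],
  },
};
```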
Case 2: Performance Regression Hunting
When performance degrades but the cause isn't obvious:
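A hypothetical performance_bottleneck invocation (argument names are placeholders):
```typescript
// Describe where the slow path starts and what Claude already suspects.
const bottleneckArgs = {
  entry_point: 'src/api/orders.ts#listOrders',                  // placeholder field
  suspected_issues: ['N+1 query', 'unbounded in-memory cache'], // placeholder field
  profile_data: '/tmp/profiles/orders-regression.cpuprofile',   // placeholder field
};
```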
Case 3: Hypothesis-Driven Debugging
When you have theories but need extensive testing:
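A hypothetical hypothesis_test invocation (argument names are placeholders):
```typescript
// State the theory and let Gemini validate it with synthetic execution.
const hypothesisArgs = {
  hypothesis: 'Orders are double-charged when a retry fires after the gateway returns 499', // placeholder field
  code_scope: ['services/payments/src/retry.ts'],
  test_approach: 'Execute a synthetic retry-after-timeout scenario and report state transitions',
};
```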
Development
Architecture
Security Considerations
- API Key: Store your Gemini API key securely in environment variables
- Code Access: The server reads local files - ensure proper file permissions
- Data Privacy: Code is sent to Google's Gemini API - review their data policies
Troubleshooting
"GEMINI_API_KEY not found"
- Ensure you've set GEMINI_API_KEY in your .env file or environment
- Check that the .env file is in the project root
"File not found" errors
- Verify that file paths passed to the tools are absolute paths
- Check file permissions
Gemini API errors
- Verify your API key is valid and has appropriate permissions
- Check API quotas and rate limits
- Ensure your Google Cloud project has the Gemini API enabled
Validation errors
- The server uses Zod for parameter validation
- Ensure all required parameters are provided
- Check that parameter names use snake_case (e.g., claude_context, not claudeContext)
- Review error messages for specific validation requirements
Best Practices for Multi-Model Debugging
When debugging distributed systems with this MCP server:
- Capture the timeline first - Use OpenTelemetry/Jaeger traces with request IDs
- Start with Claude Code - Let it handle the initial investigation and quick fixes
- Escalate strategically to Gemini when you need:
  - Analysis of traces spanning 100s of MB
  - Correlation across 10+ services
  - Iterative hypothesis testing with code execution
- Combine with traditional tools:
  - go test -race, ThreadSanitizer for race detection
  - rr or JFR for deterministic replay
  - TLA+ or Alloy for formal verification
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests for new functionality
- Submit a pull request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Author
Jonathan Haas - GitHub Profile
Acknowledgments
- Built for integration with Anthropic's Claude Code
- Powered by Google's Gemini AI
- Uses the Model Context Protocol (MCP) for communication
Support
If you encounter any issues or have questions:
- Open an issue on GitHub Issues
- Check the troubleshooting section above
- Review the MCP documentation