# MCP Orchestrator

A Model Context Protocol (MCP) server that orchestrates external AI models (Gemini 2.5 Pro and O3) to provide additional perspectives and insights while you work with Claude. The orchestrator uses external models exclusively, since you are already interacting with Claude directly.
## Architecture Overview

When you interact with Claude, this MCP server provides tools to consult external models for additional perspectives:
- Gemini 2.5 Pro (via OpenRouter): Alternative analysis and perspectives
- O3 (via OpenAI): Architectural and system design insights
Note: The orchestrator does NOT use Claude models since you're already talking to Claude. It exclusively orchestrates external models to enhance your Claude experience.
## Features
- External Model Enhancement: Get perspectives from Gemini 2.5 Pro and O3 to supplement Claude's responses
- Network Bridges: REST API (port 5050) and WebSocket (port 8765) for integration with any application
- Advanced Reasoning Strategies: External enhancement and multi-model council approaches
- MCP-Compliant: Full adherence to Model Context Protocol standards
- Secure by Design: Non-root execution, encrypted storage, API key protection
- Docker Support: Production-ready containerization with health checks
- Cost Controls: Built-in request and daily spending limits
- Stability Fixes: All known issues resolved, including the ResponseSynthesizer and lifecycle-management bugs
## Quick Start

1. Clone and Configure
2. Deploy with Docker
3. Start Network Services (Optional)
4. Use with MCP Clients
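Steps 1 and 2 might look like the following shell sketch. The `.env.example` and `docker-compose.yml` file names are assumptions about the repo layout, and the key values are placeholders:

```shell
# Hedged sketch of steps 1-2: configure keys, then deploy.
cp .env.example .env 2>/dev/null || touch .env
grep -q OPENROUTER_API_KEY .env || cat >> .env <<'EOF'
OPENROUTER_API_KEY=sk-or-v1-replace-me
OPENAI_API_KEY=sk-replace-me
EOF
# docker compose up -d --build   # step 2: build and start the container
# docker compose ps              # confirm the service reports "healthy"
```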
The orchestrator exposes 13 MCP tools that allow Claude to get external perspectives:

- `orchestrate_task`: Get external model perspectives on any task
- `analyze_task`: Analyze task complexity with external models
- `query_specific_model`: Query Gemini 2.5 Pro or O3 directly
- `code_review`: Get external code review perspectives
- `think_deeper`: Request deeper analysis from external models
- `multi_model_review`: Get multiple external perspectives
- `comparative_analysis`: Compare solutions using external models
- And more tools for specific use cases
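As a sketch, invoking one of these tools over MCP's JSON-RPC transport could look like the request below. The `task` argument name is an assumption; check the schema each tool declares via `tools/list`:

```python
import json

# Hedged sketch: a JSON-RPC 2.0 "tools/call" request for the
# orchestrate_task tool, following the MCP tool-invocation shape.
# The "task" argument name is an assumption, not the tool's
# documented schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "orchestrate_task",
        "arguments": {"task": "Review this design for race conditions"},
    },
}
print(json.dumps(request))
```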
## Architecture

The flow:
- User asks Claude a question
- Claude responds directly (primary interaction)
- Claude can optionally use MCP tools to get external perspectives
- MCP Orchestrator queries ONLY external models (Gemini 2.5 Pro and/or O3)
- External insights are integrated into Claude's response
## Configuration

### Environment Variables

| Variable | Description | Default |
|---|---|---|
| OPENROUTER_API_KEY | Your OpenRouter API key (for Gemini 2.5 Pro) | Required |
| OPENAI_API_KEY | Your OpenAI API key (for O3) | Required |
| MCP_LOG_LEVEL | Logging level | INFO |
| MCP_MAX_COST_PER_REQUEST | Max cost per request ($) | 5.0 |
| MCP_DAILY_LIMIT | Daily spending limit ($) | 100.0 |
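A matching `.env` might look like this (key values are placeholders):

```ini
# .env - values shown are placeholders
OPENROUTER_API_KEY=sk-or-v1-...
OPENAI_API_KEY=sk-...
MCP_LOG_LEVEL=INFO
MCP_MAX_COST_PER_REQUEST=5.0
MCP_DAILY_LIMIT=100.0
```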
### Strategy Configuration

Edit `config/config.yaml` to customize:
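As a hypothetical sketch of what such a config could contain, the key names below are illustrative assumptions, not the project's actual schema:

```yaml
# config/config.yaml - hypothetical sketch; key names are assumptions
strategies:
  external_enhancement:
    models: [gemini-2.5-pro, o3]
  council:
    min_models: 2
limits:
  max_cost_per_request: 5.0
  daily_limit: 100.0
```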
## Integration Options

### REST API
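A minimal Python sketch of posting a task to the REST bridge on port 5050. The `/orchestrate` path and payload fields are assumptions; consult `INTEGRATION_EXAMPLES.md` for the actual endpoint names:

```python
import json
from urllib import request as urlrequest

# Hedged sketch: POST a task to the REST bridge on port 5050.
# The /orchestrate path and payload shape are assumptions.
url = "http://localhost:5050/orchestrate"
payload = json.dumps({"task": "Summarize the tradeoffs of this schema"}).encode()
req = urlrequest.Request(
    url, data=payload, headers={"Content-Type": "application/json"}
)
# with urlrequest.urlopen(req) as resp:   # requires the bridge to be running
#     print(json.loads(resp.read()))
print(req.get_method(), req.full_url)
```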
### WebSocket
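A corresponding sketch for the WebSocket bridge on port 8765, using the third-party `websockets` package; the message shape is an assumption, and the network call is left commented so the sketch runs standalone:

```python
import asyncio
# import json, websockets   # third-party: pip install websockets

WS_URL = "ws://localhost:8765"  # bridge port from the README
message = {"tool": "think_deeper", "arguments": {"prompt": "Why might this deadlock?"}}

async def consult():
    # Real call (requires a running bridge); message shape is an assumption:
    # async with websockets.connect(WS_URL) as ws:
    #     await ws.send(json.dumps(message))
    #     return json.loads(await ws.recv())
    return message  # offline placeholder so this sketch runs standalone

result = asyncio.run(consult())
print(result["tool"])
```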
See INTEGRATION_EXAMPLES.md for more examples in various languages.
## Development

### Local Setup

### Testing with Client
## Security
- Runs as non-root user in containers
- Read-only filesystem with specific writable volumes
- Encrypted credential storage
- No capabilities beyond essentials
- Resource limits enforced
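A `docker-compose.yml` hardening sketch consistent with the bullets above; the service name and concrete values are hypothetical:

```yaml
# docker-compose.yml hardening sketch (service name and values hypothetical)
services:
  mcp-orchestrator:
    user: "1000:1000"     # non-root execution
    read_only: true       # read-only root filesystem
    tmpfs: [/tmp]         # specific writable location
    cap_drop: [ALL]       # no capabilities beyond essentials
    mem_limit: 512m       # resource limits enforced
```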
## Monitoring
- JSON structured logging
- Health checks every 30s
- Log rotation (3 files, 10MB each)
- Cost tracking and limits
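A minimal sketch of JSON structured logging as listed above, using the standard `logging` module; the field names are my own illustrative choices, not the project's actual log schema:

```python
import io
import json
import logging

# Hedged sketch of JSON structured logging; field names are assumptions.
class JsonFormatter(logging.Formatter):
    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("mcp.orchestrator")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("daily spend: $%.2f of $%.2f", 12.5, 100.0)
entry = json.loads(stream.getvalue())
print(entry["message"])
```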
## Troubleshooting

### Container won't start

### API errors
- Verify API keys in `.env`
- Check rate limits and quotas
- Review logs for specific errors
### Memory issues

- Adjust `mem_limit` in `docker-compose.yml`
- Monitor with `docker stats`
## License

MIT License - see the LICENSE file for details.