Provides web search capabilities for retrieving current information through DuckDuckGo's search engine
Supports loading environment variables from .env files for local development and testing
Provides Git repository tools and operations, including repository management, status checking, and commit history analysis
Official GitHub API integration for repository management and operations, requires a GitHub Personal Access Token (GITHUB_TOKEN)
Leverages Gemini's large token context capabilities (1M+ tokens) for extensive context analysis
Enables browser automation for web interactions and testing as an alternative automation solution
Offers Slack integration capabilities when configured with appropriate Slack Bot and App tokens
Orchestrator MCP
An intelligent MCP (Model Context Protocol) server that orchestrates multiple MCP servers and provides AI-enhanced workflow automation with production-ready context engine capabilities.
🌟 Features
Core Orchestration
- Multi-Server Orchestration: Connect to multiple MCP servers simultaneously
- Universal Compatibility: Works with npm, uvx, Python, and other MCP server types
- Server Management: Dynamic server discovery and health monitoring
- Scalable Architecture: Easy to add new servers and capabilities
🧠 AI Enhancement Layer
- Intelligent Tool Routing: AI analyzes requests and selects optimal tools
- Workflow Automation: Multi-step processes orchestrated automatically
- Intent Understanding: Natural language request analysis and planning
- Context Synthesis: Combines results from multiple tools into coherent responses
- Result Enhancement: AI improves and formats outputs for better user experience
🎯 Context Engine (PRODUCTION READY!)
- Large Context Analysis: Process 50K+ characters using Gemini's 1M+ token context
- Intelligent Code Understanding: AI-powered codebase analysis with 95% confidence
- Real-time File Discovery: Dynamic file loading and relationship mapping
- Quality Assessment: Identify placeholder vs real implementations
- Performance Optimized: 30s execution time for complex analysis
Built-in Capabilities
- Web Search: DuckDuckGo search for current information
- Fallback Mode: Graceful degradation when AI is not available
Current Status
🎉 PRODUCTION READY - Context Engine Complete!
✅ Context Engine: 85.7% quality score, 95% analysis confidence ✅ AI Enhancement Layer: Complete with intelligent routing and workflow automation ✅ Multi-Server Orchestration: 6/6 MCP servers connected and functional
🚀 Quick Start
- Install dependencies:
- Build the project:
- Configure in your MCP client (e.g., Claude Desktop, VS Code):See the example configuration files in the
examples/
directory:examples/claude-desktop-config.json
- For Claude Desktopexamples/vscode-mcp.json
- For VS Code
- Start using the orchestrator through your MCP client!
MCP Integration
For Stdio MCP Server:
- Name:
Orchestrator MCP
- Command:
node
- Arguments:
/path/to/orchestrator-mcp/dist/index.js
For Development:
- Command:
npx
- Arguments:
orchestrator-mcp
(after publishing to npm)
🛠️ Available Tools
Core AI Enhancement Tools
The orchestrator exposes a minimal set of tools focused on unique capabilities that enhance AI assistants:
ai_process
- Primary Interface - Process requests using AI orchestration with intelligent tool selectionget_info
- System introspection - Get information about connected servers and available capabilitiesai_status
- Health monitoring - Get the status of AI orchestration capabilities
Connected Server Tools
All tools from connected MCP servers are automatically available through AI orchestration:
- Filesystem operations (read, write, search files)
- Git operations (repository management, status, history)
- Memory system (knowledge graph storage)
- Web search (DuckDuckGo search for current information)
- Browser automation (Puppeteer for web scraping and automation)
- Sequential thinking (Dynamic problem-solving through thought sequences)
- And more...
🔗 Connected Servers
Currently enabled servers:
- filesystem (npm) - File operations with secure access controls
- sequential-thinking (npm) - Dynamic problem-solving through thought sequences
- git (uvx) - Git repository tools and operations
- memory (npm) - Knowledge graph-based persistent memory
- puppeteer (npm) - Browser automation and web scraping
- duckduckgo-search (npm) - Privacy-focused web search
🤖 AI Configuration
To enable AI features, you need an OpenRouter API key. Additional API keys can be configured for enhanced integrations:
- Required for AI features: Get an API key from OpenRouter
- Optional integrations: None currently required - all enabled servers work without additional API keys
- Configure the API keys in your MCP client settings:For Claude Desktop (
~/.claude_desktop_config.json
):For VS Code (.vscode/mcp.json
):
AI Models Supported
The orchestrator works with any model available on OpenRouter, including:
- Anthropic Claude (recommended)
- OpenAI GPT models
- Meta Llama models
- Google Gemini models
- And many more!
📖 Usage Examples
🎯 Context Engine (Production Ready!)
Primary AI Interface
System Introspection
AI-Enhanced Workflows
The ai_process
tool can handle complex requests like:
- "Analyze my project structure and suggest improvements"
- "Find recent commits and create a summary"
- "Search for TODO comments and organize them by priority"
- "Take a screenshot of the homepage and analyze its performance"
🏗️ Architecture
Multi-Runtime Support
The orchestrator uses a registry-based architecture supporting:
- npm servers: TypeScript/JavaScript servers via npx
- uvx servers: Python servers via uvx
- Built-in tools: Native orchestrator capabilities
AI Enhancement Layer
⚙️ Configuration
Server Configuration
Server configurations are managed in src/orchestrator/server-configs.ts
. Each server includes:
- Runtime environment (npm, uvx, python, etc.)
- Command and arguments
- Environment requirements
- Enable/disable status
- Development phase assignment
Environment Variables
All environment variables are configured through your MCP client settings. The following variables are supported:
AI Configuration (OpenRouter):
OPENROUTER_API_KEY
(required for AI features) - Your OpenRouter API keyOPENROUTER_DEFAULT_MODEL
(optional) - Default model to use (default: "anthropic/claude-3.5-sonnet")OPENROUTER_MAX_TOKENS
(optional) - Maximum tokens per request (default: "2000")OPENROUTER_TEMPERATURE
(optional) - Temperature for AI responses (default: "0.7")
MCP Server Integrations: All currently enabled MCP servers work without additional API keys or configuration.
🔧 Development
Scripts
npm run build
- Build the projectnpm run dev
- Watch mode for development (TypeScript compilation)npm run start
- Start the server (for MCP client use)npm run start:dev
- Start with .env file support (for local development/testing)npm test
- Run tests (when available)
Local Development
For local development and testing, you can use the development script that loads environment variables from a .env
file:
- Copy the example environment file:
- Edit
.env
with your actual API keys - Run the development server:
Note: The regular npm start
command is intended for MCP client use and expects environment variables to be provided by the MCP client configuration.
📝 License
MIT
hybrid server
The server is able to function both locally and remotely, depending on the configuration or use case.
An intelligent MCP server that orchestrates multiple MCP servers with AI-enhanced workflow automation and production-ready context engine capabilities for codebase analysis.
Related MCP Servers
- -securityAlicense-qualityAn MCP server implementation that standardizes how AI applications access tools and context, providing a central hub that manages tool discovery, execution, and context management with a simplified configuration system.Last updated -12PythonMIT License
- AsecurityFlicenseAqualityAn MCP server that supercharges AI assistants with powerful tools for software development, enabling research, planning, code generation, and project scaffolding through natural language interaction.Last updated -1159TypeScript
- AsecurityAlicenseAqualityA powerful MCP server that provides interactive user feedback and command execution capabilities for AI-assisted development, featuring a graphical interface with text and image support.Last updated -132PythonMIT License
- AsecurityFlicenseAqualityAn MCP server that enhances AI agents' coding capabilities by providing zero hallucinations, improved code quality, security-first approach, high test coverage, and efficient context management.Last updated -15669TypeScript