# ThinkingCap

A multi-agent research MCP server that runs multiple LLM providers in parallel and synthesizes their responses. Built on the Model Context Protocol for seamless integration with Claude Desktop, Cursor, and other MCP-compatible tools.
## Features

- **Multi-Agent Research**: Deploy multiple AI agents simultaneously for comprehensive analysis
- **Multi-Provider Support**: OpenAI, Anthropic, xAI, Google, OpenRouter, Groq, Cerebras
- **Parallel Execution**: All agents run concurrently for maximum speed
- **Intelligent Synthesis**: Combines multiple perspectives into unified, comprehensive answers
- **Built-in Web Search**: DuckDuckGo search integration (no API key required)
- **MCP Native**: Works with any MCP-compatible client via `npx`
## Quick Start

### Installation

No installation required! Just add the server to your MCP client configuration.

### Configuration

Add the following to your MCP client configuration (e.g., `~/.cursor/mcp.json`):
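A minimal example, assuming the server is published to npm under the name `thinkingcap` (the actual package name may differ):

```json
{
  "mcpServers": {
    "thinkingcap": {
      "command": "npx",
      "args": ["-y", "thinkingcap"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "ANTHROPIC_API_KEY": "sk-ant-..."
      }
    }
  }
}
```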
### Customizing Agents

You can specify any combination of providers and models as arguments:
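For example, with a hypothetical `provider:model` argument format (check the server's own help output for the exact syntax):

```json
{
  "command": "npx",
  "args": [
    "-y",
    "thinkingcap",
    "openai:gpt-5.1",
    "anthropic:claude-opus-4-5",
    "groq:moonshotai/kimi-k2-instruct-0905"
  ]
}
```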
## Supported Providers

| Provider | Env Variable | Default Model |
|----------|--------------|---------------|
| OpenAI | `OPENAI_API_KEY` | `gpt-5.1` |
| OpenRouter | `OPENROUTER_API_KEY` | `moonshotai/kimi-k2-thinking` |
| Groq | `GROQ_API_KEY` | `moonshotai/kimi-k2-instruct-0905` |
| Cerebras | `CEREBRAS_API_KEY` | `zai-glm-4.6` |
| xAI | `XAI_API_KEY` | `grok-4-fast` |
| Anthropic | `ANTHROPIC_API_KEY` | `claude-opus-4-5` |
| Google | `GEMINI_API_KEY` | `gemini-3-pro-preview` |
## Environment Variables

API keys are read from environment variables. Add them to your `~/.bashrc` or `~/.zshrc`:
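For example, using the conventional variable name for each provider's SDK (set only the providers you use):

```shell
# API keys for the providers ThinkingCap should use
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export XAI_API_KEY="xai-..."
export GEMINI_API_KEY="..."
export OPENROUTER_API_KEY="sk-or-..."
export GROQ_API_KEY="gsk_..."
export CEREBRAS_API_KEY="csk-..."
```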
## How It Works

1. **Query Decomposition**: Your research query is broken into multiple specialized questions
2. **Parallel Execution**: Each agent (provider/model combo) researches a different angle
3. **Web Search**: Each agent performs web searches to gather current information
4. **Synthesis**: All agent responses are combined into one comprehensive answer
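The fan-out/synthesis pattern above can be sketched as follows. This is an illustrative sketch, not the server's actual code: the stub agents stand in for real provider calls, and the synthesis step (an LLM merge in the real server) is reduced to a join:

```typescript
// An "agent" is anything that turns a query into an answer asynchronously.
type Agent = (query: string) => Promise<string>;

// Stub agents standing in for real provider calls (OpenAI, Anthropic, ...).
const agents: Agent[] = [
  async (q) => `perspective A on: ${q}`,
  async (q) => `perspective B on: ${q}`,
];

// Run all agents concurrently; one agent failing doesn't block the rest.
async function research(query: string): Promise<string> {
  const results = await Promise.allSettled(agents.map((a) => a(query)));
  const answers = results
    .filter((r): r is PromiseFulfilledResult<string> => r.status === "fulfilled")
    .map((r) => r.value);
  // Synthesis step: the real server asks an LLM to merge these perspectives.
  return answers.join("\n---\n");
}

research("example query").then((out) => console.log(out));
```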
## OpenRouter Fireworks Routing

When using OpenRouter, requests are automatically routed to Fireworks as the preferred provider with fallbacks enabled for maximum reliability.
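In OpenRouter's chat completions API this corresponds to a `provider` preferences object along these lines (a sketch based on OpenRouter's documented provider-routing options; the model shown is just an example):

```json
{
  "model": "moonshotai/kimi-k2-thinking",
  "provider": {
    "order": ["fireworks"],
    "allow_fallbacks": true
  },
  "messages": [{ "role": "user", "content": "..." }]
}
```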
## License

MIT License

## Acknowledgments

- Built on the Model Context Protocol
- Inspired by multi-agent AI research systems