# Claude Team

Multi-Agent MCP Server for AI-Powered Development Teams

Orchestrate GPT, Claude, Gemini, and more to collaborate on complex tasks.
## ✨ Features
| Feature | Description |
|---------|-------------|
| Multi-Model Collaboration | Configure multiple AI models to work together, each leveraging its strengths |
| Smart Task Distribution | A Tech Lead analyzes tasks and automatically assigns them to the best-suited experts |
| Workflow Templates | 5 pre-built workflows: code generation, bug fixing, refactoring, review, documentation |
| Custom Experts | Define your own experts (Rust, K8s, Security, etc.) via environment variables |
| Observability | Dashboard, cost estimation, and task planning preview |
| Proxy API Support | Custom base URLs, compatible with various proxy services |
| Collaboration History | Complete record of all collaborations, with search support |
## Quick Start

### Installation

```shell
# Global install
npm install -g claude-team

# Or use directly with npx (no install needed)
npx claude-team
```

### Basic Configuration

Add to your IDE's MCP configuration file:
| IDE | Path |
|-----|------|
| Claude Code | |
| Windsurf | |
| Cursor | |
```json
{
  "mcpServers": {
    "claude-team": {
      "command": "npx",
      "args": ["-y", "claude-team"],
      "env": {
        "CLAUDE_TEAM_MAIN_KEY": "sk-your-api-key",
        "CLAUDE_TEAM_MAIN_URL": "https://api.openai.com/v1",
        "CLAUDE_TEAM_MAIN_MODEL": "gpt-4o",
        "CLAUDE_TEAM_MAIN_PROVIDER": "openai"
      }
    }
  }
}
```

### Start Using
> Help me build a user login feature with the team
>
> Have the team optimize this code for performance

## How It Works
```
User: "Optimize this SQL query for performance"

Tech Lead Analysis →
├── Creates: SQL Optimization Expert (powerful)
├── Creates: Index Analysis Expert (balanced)
└── Workflow: sequential
```

```
User: "Build a settings page with dark mode"

Tech Lead Analysis →
├── Creates: UI Component Expert (balanced)
├── Creates: Theme System Expert (fast)
├── Creates: State Management Expert (balanced)
└── Workflow: parallel → review
```

## Available Tools

### Core Tools
| Tool | Description |
|------|-------------|
| | Team collaboration (auto-creates experts) |
| | Consult an expert (supports custom experts) |
| | Code review |
| | Bug fixing |
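Since the tool identifiers themselves are not listed above, here is a hedged sketch of what an MCP `tools/call` request to the team-collaboration tool might look like. `team_collaborate` and its `task` argument are hypothetical placeholder names, not confirmed identifiers:

```python
import json

# Hypothetical MCP "tools/call" request; "team_collaborate" and "task"
# are placeholder names (the real tool names are not shown above).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "team_collaborate",
        "arguments": {"task": "Optimize this SQL query for performance"},
    },
}

payload = json.dumps(request)  # what an MCP client would send over stdio
```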
### Workflow Tools

| Tool | Description |
|------|-------------|
| | List all workflow templates |
| | Execute a specific workflow |
| | Auto-recommend a workflow based on the task |
**Pre-built workflows:**

| Workflow | Purpose | Steps |
|----------|---------|-------|
| | Generate code from requirements | Design → Implement → Test → Review |
| | Diagnose and fix bugs | Diagnose → Fix → Verify |
| | Code refactoring | Analyze → Plan → Execute → Review |
| | Multi-dimensional review | Security / Quality / Performance (parallel) |
| | Generate documentation | Analyze → Document |
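The Steps columns above imply two execution orders: sequential chains, where each expert sees the previous step's output, and parallel fan-outs, where independent reviews run at once. A minimal Python sketch of the two orders (illustrative only, not the package's actual implementation):

```python
from concurrent.futures import ThreadPoolExecutor

# run_step stands in for a single expert-model call.
def run_step(name: str, context: str) -> str:
    return f"{name}: handled ({context})"

def run_sequential(steps, context):
    # Each step receives the previous step's output (Diagnose -> Fix -> Verify).
    for step in steps:
        context = run_step(step, context)
    return context

def run_parallel(steps, context):
    # Independent steps (Security / Quality / Performance) run concurrently.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda s: run_step(s, context), steps))
```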
### Observability Tools

| Tool | Description |
|------|-------------|
| | View team status, experts, models, stats |
| | Estimate task cost (tokens, price, time) |
| | Preview the task assignment plan |
| | View model usage statistics |
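The cost-estimation tool reports tokens, price, and time. As a rough illustration of token-based pricing only, here is a minimal sketch; the real tool's formula and per-model prices are not documented here, so both are assumptions:

```python
# Illustrative per-1k-token pricing; actual prices/formula are assumptions.
def estimate_cost(tokens_in: int, tokens_out: int,
                  price_in_per_1k: float, price_out_per_1k: float) -> float:
    return (tokens_in / 1000 * price_in_per_1k
            + tokens_out / 1000 * price_out_per_1k)
```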
### Integration Tools

| Tool | Description |
|------|-------------|
| | Read project files for context |
| | Analyze project structure and tech stack |
| | Generate a commit message from a diff |
### History Tools

| Tool | Description |
|------|-------------|
| | View collaboration history |
| | Get history details |
| | Search history records |
| | Get recent context |
## Configuration

### Environment Variables

| Variable | Required | Description |
|----------|----------|-------------|
| `CLAUDE_TEAM_MAIN_KEY` | Yes | Main model API key |
| `CLAUDE_TEAM_MAIN_URL` | No | Main model API URL |
| `CLAUDE_TEAM_MAIN_MODEL` | No | Main model ID (default: `gpt-4o`) |
| `CLAUDE_TEAM_MAIN_PROVIDER` | No | Provider, e.g. `openai` |
| | No | Worker model N config (inherits from MAIN) |
| `CLAUDE_TEAM_CUSTOM_EXPERTS` | No | Custom experts (JSON format) |

N = 1, 2, 3, ...; up to 10 worker models are supported.
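As an illustration of how a client might consume these variables, here is a minimal sketch. Only the `gpt-4o` default is documented; the URL and provider fallbacks below are assumptions taken from the example configuration:

```python
import os

def main_model_config(env=None):
    """Assemble main-model settings from the documented variables.

    The URL/provider fallbacks are assumptions from the example config;
    only the "gpt-4o" model default is documented.
    """
    env = os.environ if env is None else env
    return {
        "key": env.get("CLAUDE_TEAM_MAIN_KEY"),  # required
        "url": env.get("CLAUDE_TEAM_MAIN_URL", "https://api.openai.com/v1"),
        "model": env.get("CLAUDE_TEAM_MAIN_MODEL", "gpt-4o"),
        "provider": env.get("CLAUDE_TEAM_MAIN_PROVIDER", "openai"),
    }
```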
### Custom Experts

Define your own experts beyond the built-in `frontend`, `backend`, and `qa`:

```json
{
  "env": {
    "CLAUDE_TEAM_CUSTOM_EXPERTS": "{\"rust\":{\"name\":\"Rust Expert\",\"prompt\":\"You are a Rust expert...\",\"tier\":\"powerful\"},\"k8s\":{\"name\":\"K8s Expert\",\"prompt\":\"You are a Kubernetes expert...\",\"tier\":\"balanced\"}}"
  }
}
```

| Field | Required | Description |
|-------|----------|-------------|
| `name` | Yes | Expert display name |
| `prompt` | Yes | Expert role description (system prompt) |
| `tier` | No | Model tier: `fast`, `balanced`, or `powerful` |
| `tags` | No | Skill tags array |
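Because the value is a JSON string embedded in an environment variable, a quick sanity check before launch can catch malformed expert definitions. A minimal sketch enforcing the field rules above (the `balanced` fallback tier is an assumption, not documented behavior):

```python
import json

VALID_TIERS = {"fast", "balanced", "powerful"}

def parse_custom_experts(raw: str) -> dict:
    """Parse a CLAUDE_TEAM_CUSTOM_EXPERTS value and check required fields."""
    experts = json.loads(raw)
    for key, spec in experts.items():
        for field in ("name", "prompt"):          # required per the table above
            if field not in spec:
                raise ValueError(f"expert {key!r} is missing {field!r}")
        # "balanced" fallback is an assumption; tier itself is optional.
        if spec.get("tier", "balanced") not in VALID_TIERS:
            raise ValueError(f"expert {key!r} has unknown tier {spec['tier']!r}")
    return experts
```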
### Model Tiers

| Tier | Use Case | Example Scenarios |
|------|----------|-------------------|
| `fast` | Simple, quick tasks | Formatting, simple queries, docs |
| `balanced` | Regular dev tasks | Components, APIs, unit tests |
| `powerful` | Complex reasoning | Architecture, optimization, security |
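The Tech Lead's actual tier-selection logic is not documented; as a purely illustrative heuristic, a mapping like the following captures the intent of the table:

```python
# Illustrative heuristic only; the real selection logic is undocumented.
def pick_tier(task_kind: str) -> str:
    mapping = {
        "formatting": "fast",
        "component": "balanced",
        "unit-test": "balanced",
        "architecture": "powerful",
        "security": "powerful",
    }
    return mapping.get(task_kind, "balanced")  # assumed default tier
```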
## Changelog

### v0.4.0

- Custom Experts - Define experts via environment variables
- Workflow Templates - 5 pre-built workflows
- Observability - Dashboard, cost estimation, plan preview
- Integration - Project file reading, structure analysis, commit messages
- Smart Recommendations - Auto-suggest workflows
- Test Coverage - 155 test cases

### v0.3.0

- Task interrupt/resume support
- Multi-turn expert conversations
- Token counting and cost estimation
- Expert templates (6 built-in + custom)
- Webhook notifications
- Exponential backoff retry
- Hot config reload

### v0.2.x

- Streaming output support
- Usage statistics
- Model strategies
- Result caching
- Auto model switching

### v0.1.x

- Initial release
- Multi-model collaboration
- Proxy API support
## Contributing

Contributions are welcome! Please read our:
## License

## Star History

Made with ❤️ by the community