
Claude Code AI Collaboration MCP Server

by atsuki-sakai

A powerful Model Context Protocol (MCP) server that enables AI collaboration through multiple providers with advanced strategies and comprehensive tooling.


🌟 Features

🤖 Multi-Provider AI Integration

  • DeepSeek: Primary provider with optimized performance

  • OpenAI: GPT models integration

  • Anthropic: Claude models support

  • O3: Next-generation model support

🚀 Advanced Collaboration Strategies

  • Parallel: Execute requests across multiple providers simultaneously

  • Sequential: Chain provider responses for iterative improvement

  • Consensus: Build agreement through multiple provider opinions

  • Iterative: Refine responses through multiple rounds
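
To make the consensus idea concrete, here is a hypothetical sketch (not the server's actual implementation): each provider contributes one answer, and the majority answer is accepted only when its share of responses meets a configurable threshold, mirroring the `consensus_threshold` option used by the collaborate tool.

```typescript
// Hypothetical consensus aggregation: group identical answers and accept
// the most common one only if its share meets the agreement threshold.
type ProviderResponse = { provider: string; answer: string };

function consensus(
  responses: ProviderResponse[],
  threshold = 0.7
): string | null {
  const counts = new Map<string, number>();
  for (const r of responses) {
    counts.set(r.answer, (counts.get(r.answer) ?? 0) + 1);
  }
  let best: string | null = null;
  let bestCount = 0;
  for (const [answer, count] of counts) {
    if (count > bestCount) {
      best = answer;
      bestCount = count;
    }
  }
  // No consensus unless the winning answer's share reaches the threshold.
  return bestCount / responses.length >= threshold ? best : null;
}
```

With three providers and a 0.6 threshold, two matching answers out of three (a 0.67 share) would be accepted, while a 1-of-2 split under a 0.7 threshold would yield no consensus.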

๐Ÿ› ๏ธ Comprehensive MCP Tools

  • collaborate: Multi-provider collaboration with strategy selection

  • review: Content analysis and quality assessment

  • compare: Side-by-side comparison of multiple items

  • refine: Iterative content improvement

📊 Enterprise Features

  • Caching: Memory and Redis-compatible caching system

  • Metrics: OpenTelemetry-compatible performance monitoring

  • Search: Full-text search with inverted indexing

  • Synthesis: Intelligent response aggregation
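
As an illustration of the in-memory caching tier, the sketch below shows a minimal TTL cache keyed by string. The names (`MemoryCache`, `ttlMs`) are illustrative assumptions, not the server's actual API.

```typescript
// Minimal TTL cache sketch: entries expire ttlMs after being written and
// are lazily evicted on read.
class MemoryCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>();

  constructor(private ttlMs: number) {}

  set(key: string, value: V): void {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // lazy expiry on read
      return undefined;
    }
    return entry.value;
  }
}
```

A Redis-compatible tier would expose the same get/set shape but share entries across server instances.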

🚀 Quick Start

📖 New to MCP? Check out our Quick Start Guide for a 5-minute setup!

Prerequisites

  • Node.js 18.0.0 or higher

  • pnpm 8.0.0 or higher

  • TypeScript 5.3.0 or higher

Installation

```bash
# Clone the repository
git clone https://github.com/atsuki-sakai/ai_collaboration_mcp_server.git
cd ai_collaboration_mcp_server

# Install dependencies
pnpm install

# Build the project
pnpm run build

# Run tests
pnpm test
```

Configuration

  1. Environment Variables:

```bash
# Required: Set your API keys
export DEEPSEEK_API_KEY="your-deepseek-api-key"
export OPENAI_API_KEY="your-openai-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"

# Optional: Configure other settings
export MCP_DEFAULT_PROVIDER="deepseek"
export MCP_PROTOCOL="stdio"
```
  2. Configuration Files:

    • config/default.yaml: Default configuration

    • config/development.yaml: Development settings

    • config/production.yaml: Production settings

Running the Server

```bash
# Start with default settings
pnpm start

# Start with specific protocol
node dist/index.js --protocol stdio

# Start with custom providers
node dist/index.js --providers deepseek,openai --default-provider deepseek

# Enable debug mode
NODE_ENV=development LOG_LEVEL=debug pnpm start
```

🔗 Claude Code Integration

Connecting to Claude Code

To use this MCP server with Claude Code, you need to configure Claude Code to recognize and connect to your server.

1a. Automated Setup

Use the automated setup script for easy configuration:

```bash
# Navigate to your project directory
cd /Users/atsukisakai/Desktop/ai_collaboration_mcp_server

# Run automated setup with your DeepSeek API key
./scripts/setup-claude-code.sh --api-key "your-deepseek-api-key"

# Or with multiple providers
./scripts/setup-claude-code.sh \
  --api-key "your-deepseek-key" \
  --openai-key "your-openai-key" \
  --anthropic-key "your-anthropic-key"

# Alternative using pnpm
pnpm run setup:claude-code -- --api-key "your-deepseek-key"
```

The setup script will:

  • ✅ Build the MCP server

  • ✅ Create Claude Code configuration file

  • ✅ Test the server connection

  • ✅ Provide next steps

1b. Manual Setup

If you prefer manual setup:

```bash
# Navigate to your project directory
cd /Users/atsukisakai/Desktop/ai_collaboration_mcp_server

# Install dependencies and build
pnpm install
pnpm run build

# Set your DeepSeek API key
export DEEPSEEK_API_KEY="your-deepseek-api-key"

# Test the server
pnpm run verify-deepseek
```

2. Configure Claude Code

Create or update the Claude Code configuration file:

Note: There are two server options:

  • simple-server.js - Simple implementation with DeepSeek only (recommended for testing)

  • index.js - Full implementation with all providers and features

macOS/Linux:

```bash
# Create config directory if it doesn't exist
mkdir -p ~/.config/claude-code

# Create configuration file (simple server - recommended for testing)
cat > ~/.config/claude-code/claude_desktop_config.json << 'EOF'
{
  "mcpServers": {
    "ai-collaboration": {
      "command": "node",
      "args": ["/Users/atsukisakai/Desktop/ai_collaboration_mcp_server/dist/simple-server.js"],
      "env": {
        "DEEPSEEK_API_KEY": "your-deepseek-api-key"
      }
    }
  }
}
EOF

# Or use the full server for all features
# Replace simple-server.js with index.js in the args above
```

Windows:

```
# Create config directory
mkdir "%APPDATA%\Claude"

# Create configuration file (use your preferred text editor)
# File: %APPDATA%\Claude\claude_desktop_config.json
```

3. Configuration Options

```json
{
  "mcpServers": {
    "ai-collaboration": {
      "command": "node",
      "args": [
        "/Users/atsukisakai/Desktop/ai_collaboration_mcp_server/dist/index.js",
        "--default-provider", "deepseek",
        "--providers", "deepseek,openai"
      ],
      "env": {
        "DEEPSEEK_API_KEY": "your-deepseek-api-key",
        "OPENAI_API_KEY": "your-openai-api-key",
        "ANTHROPIC_API_KEY": "your-anthropic-api-key",
        "NODE_ENV": "production",
        "LOG_LEVEL": "info",
        "MCP_DISABLE_CACHING": "false",
        "MCP_DISABLE_METRICS": "false"
      }
    }
  }
}
```

4. Available Tools in Claude Code

After restarting Claude Code, you'll have access to these powerful tools:

  • ๐Ÿค collaborate - Multi-provider AI collaboration

  • ๐Ÿ“ review - Content analysis and quality assessment

  • โš–๏ธ compare - Side-by-side comparison of multiple items

  • โœจ refine - Iterative content improvement

5. Usage Examples in Claude Code

```
# Use DeepSeek for code explanation
Please use the collaborate tool to explain this Python code with DeepSeek

# Review code quality
Use the review tool to analyze the quality of this code

# Compare multiple solutions
Use the compare tool to compare these 3 approaches to solving this problem

# Improve code iteratively
Use the refine tool to make this function more efficient
```

6. Troubleshooting

Check MCP server connectivity:

```bash
# Test if the server starts correctly
DEEPSEEK_API_KEY="your-key" node dist/index.js --help
```

View logs:

```bash
# Check application logs
tail -f logs/application-$(date +%Y-%m-%d).log
```

Verify Claude Code configuration:

  1. Restart Claude Code completely

  2. In a new conversation, ask "What tools are available?"

  3. You should see the four MCP tools listed

  4. Test with a simple command like "Use collaborate to say hello"

7. Configuration File Locations

  • macOS: ~/.config/claude-code/claude_desktop_config.json

  • Windows: %APPDATA%\Claude\claude_desktop_config.json

  • Linux: ~/.config/claude-code/claude_desktop_config.json

📖 Usage

MCP Tools

Collaborate Tool

Execute multi-provider collaboration with strategy selection:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "collaborate",
    "arguments": {
      "prompt": "Explain quantum computing in simple terms",
      "strategy": "consensus",
      "providers": ["deepseek", "openai"],
      "config": {
        "timeout": 30000,
        "consensus_threshold": 0.7
      }
    }
  }
}
```

Review Tool

Analyze content quality and provide detailed feedback:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "review",
    "arguments": {
      "content": "Your content here...",
      "criteria": ["accuracy", "clarity", "completeness"],
      "review_type": "comprehensive"
    }
  }
}
```

Compare Tool

Compare multiple items with detailed analysis:

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "compare",
    "arguments": {
      "items": [
        {"id": "1", "content": "Option A"},
        {"id": "2", "content": "Option B"}
      ],
      "comparison_dimensions": ["quality", "relevance", "innovation"]
    }
  }
}
```

Refine Tool

Iteratively improve content quality:

```json
{
  "jsonrpc": "2.0",
  "id": 4,
  "method": "tools/call",
  "params": {
    "name": "refine",
    "arguments": {
      "content": "Content to improve...",
      "refinement_goals": {
        "primary_goal": "clarity",
        "target_audience": "general public"
      }
    }
  }
}
```

Available Resources

  • collaboration_history: Access past collaboration results

  • provider_stats: Monitor provider performance metrics

  • tool_usage: Track tool utilization statistics
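
Resources are fetched with the standard MCP `resources/read` request. The URI value below is illustrative; the server's actual URI scheme for these resources is not documented here.

```json
{
  "jsonrpc": "2.0",
  "id": 5,
  "method": "resources/read",
  "params": {
    "uri": "collaboration_history"
  }
}
```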

๐Ÿ—๏ธ Architecture

Core Components

```
src/
├── core/                     # Core framework components
│   ├── types.ts              # Dependency injection symbols
│   ├── logger.ts             # Structured logging
│   ├── config.ts             # Configuration management
│   ├── container.ts          # DI container setup
│   ├── provider-manager.ts   # AI provider orchestration
│   ├── strategy-manager.ts   # Execution strategy management
│   └── tool-manager.ts       # MCP tool management
├── providers/                # AI provider implementations
│   ├── base-provider.ts      # Common provider functionality
│   ├── deepseek-provider.ts
│   ├── openai-provider.ts
│   ├── anthropic-provider.ts
│   └── o3-provider.ts
├── strategies/               # Collaboration strategies
│   ├── parallel-strategy.ts
│   ├── sequential-strategy.ts
│   ├── consensus-strategy.ts
│   └── iterative-strategy.ts
├── tools/                    # MCP tool implementations
│   ├── collaborate-tool.ts
│   ├── review-tool.ts
│   ├── compare-tool.ts
│   └── refine-tool.ts
├── services/                 # Enterprise services
│   ├── cache-service.ts
│   ├── metrics-service.ts
│   ├── search-service.ts
│   └── synthesis-service.ts
├── server/                   # MCP server implementation
│   └── mcp-server.ts
└── types/                    # Type definitions
    ├── common.ts
    ├── interfaces.ts
    └── index.ts
```

Design Principles

  • Dependency Injection: Clean architecture with InversifyJS

  • Strategy Pattern: Pluggable collaboration strategies

  • Provider Abstraction: Unified interface for different AI services

  • Performance: Efficient caching and rate limiting

  • Observability: Comprehensive metrics and logging

  • Extensibility: Easy to add new providers and strategies
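
The strategy pattern above can be sketched as follows. This is an illustrative reduction, not the server's actual classes: each strategy implements a shared interface, and a registry resolves one by name at runtime, so new strategies can be added without touching the caller.

```typescript
// Pluggable strategies behind a common interface (illustrative names).
interface CollaborationStrategy {
  execute(prompt: string, providers: string[]): string;
}

class ParallelStrategy implements CollaborationStrategy {
  execute(prompt: string, providers: string[]): string {
    // Real code would fan the prompt out to all providers concurrently.
    return `parallel:${providers.length}`;
  }
}

class SequentialStrategy implements CollaborationStrategy {
  execute(prompt: string, providers: string[]): string {
    // Real code would chain each provider's output into the next request.
    return `sequential:${providers.length}`;
  }
}

const registry = new Map<string, CollaborationStrategy>([
  ["parallel", new ParallelStrategy()],
  ["sequential", new SequentialStrategy()],
]);

function runStrategy(
  name: string,
  prompt: string,
  providers: string[]
): string {
  const strategy = registry.get(name);
  if (!strategy) throw new Error(`Unknown strategy: ${name}`);
  return strategy.execute(prompt, providers);
}
```

In the actual codebase this wiring is handled by the DI container and strategy-manager.ts rather than a hand-built map.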

🔧 Configuration

Configuration Schema

The server uses YAML configuration files with JSON Schema validation. See config/schema.json for the complete schema.
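
For orientation, a minimal configuration might look like the fragment below. The exact keys and defaults are governed by config/schema.json, so treat every value here as illustrative:

```yaml
# Illustrative only - consult config/schema.json for the authoritative keys
server:
  name: "claude-code-ai-collab-mcp"
  protocol: "stdio"
providers:
  - name: "deepseek"
    enabled: true
strategies:
  consensus:
    timeout: 30000
cache:
  type: "memory"
metrics:
  enabled: true
logging:
  level: "info"
```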

Key Configuration Sections

  • Server: Basic server settings (name, version, protocol)

  • Providers: AI provider configurations and credentials

  • Strategies: Strategy-specific settings and timeouts

  • Cache: Caching behavior (memory, Redis, file)

  • Metrics: Performance monitoring settings

  • Logging: Log levels and output configuration

Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| DEEPSEEK_API_KEY | DeepSeek API key | Required |
| OPENAI_API_KEY | OpenAI API key | Optional |
| ANTHROPIC_API_KEY | Anthropic API key | Optional |
| O3_API_KEY | O3 API key (defaults to OPENAI_API_KEY) | Optional |
| MCP_PROTOCOL | Transport protocol | stdio |
| MCP_DEFAULT_PROVIDER | Default AI provider | deepseek |
| NODE_ENV | Environment mode | production |
| LOG_LEVEL | Logging level | info |

📊 Monitoring & Metrics

Built-in Metrics

  • Request Metrics: Response times, success rates, error counts

  • Provider Metrics: Individual provider performance

  • Tool Metrics: Usage statistics per MCP tool

  • Cache Metrics: Hit rates, memory usage

  • System Metrics: CPU, memory, and resource utilization

OpenTelemetry Integration

The server supports OpenTelemetry for distributed tracing and metrics collection:

```yaml
metrics:
  enabled: true
  export:
    enabled: true
    format: "opentelemetry"
    endpoint: "http://localhost:4317"
```

🧪 Testing

Test Coverage

  • Unit Tests: 95+ individual component tests

  • Integration Tests: End-to-end MCP protocol testing

  • E2E Tests: Complete workflow validation

  • API Tests: Direct provider API validation

Running Tests

```bash
# Run all tests
pnpm test

# Run with coverage
pnpm run test:coverage

# Run specific test suites
pnpm run test:unit
pnpm run test:integration
pnpm run test:e2e

# Verify API connectivity
pnpm run verify-deepseek
```

🚢 Deployment

Docker

```bash
# Build image
docker build -t claude-code-ai-collab-mcp .

# Run container
docker run -d \
  -e DEEPSEEK_API_KEY=your-key \
  -p 3000:3000 \
  claude-code-ai-collab-mcp
```

Production Considerations

  • Load Balancing: Multiple server instances for high availability

  • Caching: Redis for distributed caching

  • Monitoring: Prometheus/Grafana for metrics visualization

  • Security: API key rotation and rate limiting

  • Backup: Regular configuration and data backups

๐Ÿค Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines.

Development Setup

```bash
# Fork and clone the repository
git clone https://github.com/atsuki-sakai/ai_collaboration_mcp_server.git
cd ai_collaboration_mcp_server

# Install dependencies
pnpm install

# Start development
pnpm run dev

# Run tests
pnpm test

# Lint and format
pnpm run lint
pnpm run lint:fix
```

📋 Roadmap

Version 1.1

  • GraphQL API support

  • WebSocket transport protocol

  • Advanced caching strategies

  • Custom strategy plugins

Version 1.2

  • Multi-tenant support

  • Enhanced security features

  • Performance optimizations

  • Additional AI providers

Version 2.0

  • Distributed architecture

  • Advanced workflow orchestration

  • Machine learning optimization

  • Enterprise SSO integration

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🆘 Support

๐Ÿ™ Acknowledgments


Built with โค๏ธ by the Claude Code AI Collaboration Team# think_hub
