
AI Collaboration MCP Server

by hurryupmitch

A streamlined Model Context Protocol (MCP) server that provides enhanced AI collaboration tools for VS Code with automatic project context injection and conversation history.

🚀 Features

  • Multi-Provider Support: Claude, GPT-4, Gemini, and Ollama
  • Workspace-Specific Conversation History: Each project gets isolated conversation memory
  • Automatic Context Injection: Project files, structure, and README automatically included
  • Dynamic Workspace Management: Switch between projects seamlessly
  • API Call Management: Rate limiting (3 calls per provider per hour)
  • Streamlined Tools: Just 4 essential tools that work together

🛠️ Tools Available

1. #set_workspace

Set the current workspace directory for project-specific conversation history and context.

Usage in VS Code:

@workspace use #set_workspace with workspace_path="/path/to/your/project"

2. #consult_ai

Get expert advice from a specific AI provider with full project context.

Usage in VS Code:

@workspace use #consult_ai with claude about error handling best practices

3. #multi_ai_research

Get perspectives from multiple AI providers on complex questions.

Usage in VS Code:

@workspace use #multi_ai_research to analyze authentication approaches

4. #mandatory_execute

Force tool execution with explicit commands.

Usage in VS Code:

@workspace !consult_ai
@workspace use #multi_ai_research

📦 Installation

Prerequisites

  • Node.js 18+
  • VS Code with MCP support
  • API keys for desired AI providers

1. Clone and Setup

git clone https://github.com/yourusername/ai-collaboration-mcp-server.git
cd ai-collaboration-mcp-server
npm install

2. Configure Environment Variables

Create a .env file:

# AI Provider API Keys (add the ones you want to use)
ANTHROPIC_API_KEY=your_claude_key_here
OPENAI_API_KEY=your_openai_key_here
GEMINI_API_KEY=your_gemini_key_here

# Ollama Configuration (for local AI)
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama3.2:latest

3. Build the Server

npm run build

4. Configure VS Code MCP

Option A: Workspace-specific (recommended for testing)

Create .vscode/mcp.json in your project:

{
  "servers": {
    "ai-collaboration": {
      "type": "stdio",
      "command": "node",
      "args": ["/path/to/ai-collaboration-mcp-server/build/index.js"],
      "env": {
        "ANTHROPIC_API_KEY": "your_key_here",
        "OPENAI_API_KEY": "your_key_here",
        "GEMINI_API_KEY": "your_key_here",
        "OLLAMA_BASE_URL": "http://localhost:11434"
      }
    }
  }
}

Option B: Global configuration (for all projects)

Create ~/.vscode/mcp.json:

{
  "servers": {
    "ai-collaboration": {
      "type": "stdio",
      "command": "node",
      "args": ["/absolute/path/to/ai-collaboration-mcp-server/build/index.js"],
      "env": {
        "ANTHROPIC_API_KEY": "your_key_here",
        "OPENAI_API_KEY": "your_key_here",
        "GEMINI_API_KEY": "your_key_here",
        "OLLAMA_BASE_URL": "http://localhost:11434"
      }
    }
  }
}

5. Enable MCP Auto-start (Optional)

Add to your VS Code settings.json:

{
  "chat.mcp.autostart": "newAndOutdated"
}

🎯 Usage

First Time Setup Per Project:

  1. Restart VS Code after configuration
  2. Open VS Code chat (sidebar or Cmd+Shift+I)
  3. Set workspace for your project:
    @workspace use #set_workspace with workspace_path="/full/path/to/your/project"

Daily Usage:

  1. Use the AI tools:
    • @workspace use #consult_ai with claude about my code
    • @workspace use #multi_ai_research to compare approaches
    • @workspace !consult_ai (force execution)

When Switching Projects:

  1. Set new workspace:
    @workspace use #set_workspace with workspace_path="/path/to/other/project"

💡 Tip: Each project gets its own .mcp-conversation-history.json file for isolated conversation memory.
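The exact schema of the history file isn't documented here; as a purely illustrative sketch (all field names are assumptions), a project's .mcp-conversation-history.json might contain entries along these lines:

```json
{
  "workspace": "/path/to/your/project",
  "entries": [
    {
      "timestamp": "2024-01-15T10:30:00Z",
      "provider": "claude",
      "question": "What are error handling best practices here?",
      "answer": "..."
    }
  ]
}
```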

⚠️ Important Syntax Note

When using @workspace in VS Code, MCP tool names must be prefixed with #:

Correct: @workspace use #consult_ai with claude about my code
Wrong: @workspace use consult_ai with claude about my code

Without @workspace, no # is needed.
✅ Also correct: use consult_ai with claude about my code

🔧 Development

Run in Development Mode

npm run dev

Test the Server

npm test

Debug with MCP Inspector

npx @modelcontextprotocol/inspector node build/index.js

🧠 How It Works

Enhanced Context Injection

Every tool call automatically includes:

  • Project structure and files
  • README and package.json content
  • Relevant conversation history
  • Current workspace context
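The context-gathering step above could be sketched roughly like this (a minimal illustration, not the server's actual implementation; the function and type names are hypothetical):

```typescript
// Hypothetical sketch: collect README, package.json, and a shallow
// file listing from the workspace root for context injection.
import * as fs from "node:fs";
import * as path from "node:path";

interface ProjectContext {
  files: string[];
  readme: string | null;
  packageJson: string | null;
}

function gatherContext(workspaceRoot: string): ProjectContext {
  // Shallow listing of top-level entries in the workspace
  const files = fs.readdirSync(workspaceRoot);
  const readFile = (name: string): string | null => {
    const p = path.join(workspaceRoot, name);
    return fs.existsSync(p) ? fs.readFileSync(p, "utf8") : null;
  };
  return {
    files,
    readme: readFile("README.md"),
    packageJson: readFile("package.json"),
  };
}
```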

Conversation History

  • Persistent file-based history (.mcp-conversation-history.json)
  • Smart relevance filtering
  • Cross-session context continuity

API Management

  • Rate limiting per provider (3 calls/hour)
  • Automatic retry with exponential backoff
  • Clear error handling and user feedback
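The rate-limiting and retry behavior described above could look roughly like the following (the limits match the README's stated 3 calls per provider per hour; the helper names are illustrative, not the server's real API):

```typescript
// Illustrative sketch of per-provider rate limiting and exponential backoff.
const WINDOW_MS = 60 * 60 * 1000; // 1 hour
const MAX_CALLS = 3;               // per provider per window
const callLog = new Map<string, number[]>(); // provider -> call timestamps

function allowCall(provider: string, now = Date.now()): boolean {
  // Keep only timestamps inside the sliding window
  const recent = (callLog.get(provider) ?? []).filter(t => now - t < WINDOW_MS);
  if (recent.length >= MAX_CALLS) return false;
  recent.push(now);
  callLog.set(provider, recent);
  return true;
}

async function withBackoff<T>(fn: () => Promise<T>, retries = 3): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retries) throw err;
      // Exponential backoff: 500ms, 1s, 2s, ...
      await new Promise(r => setTimeout(r, 500 * 2 ** attempt));
    }
  }
}
```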

🔒 Security Notes

  • API keys are stored in MCP configuration (keep them secure)
  • Conversation history is stored locally
  • No data sent to external services except AI provider APIs

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Test thoroughly
  5. Submit a pull request

📄 License

MIT License - see LICENSE file for details

🆘 Troubleshooting

MCP Server Won't Start

  1. Check Cmd+Shift+P → "MCP: List Servers"
  2. Verify file paths in configuration
  3. Check VS Code Output panel for errors
  4. Ensure Node.js and dependencies are installed

API Keys Not Working

  1. Verify keys are correctly set in MCP configuration
  2. Check for typos or extra spaces
  3. Ensure keys have proper permissions

Tools Not Appearing

  1. Restart VS Code completely
  2. Try @workspace in chat to trigger MCP loading
  3. Check MCP server logs for errors

🌟 Why This Approach?

This streamlined server demonstrates that smart consolidation beats feature proliferation:

  • 4 core tools instead of 7+ specialized ones
  • Enhanced context shared across all tools
  • Easier maintenance and debugging
  • Better user experience with consistent functionality
  • Reduced cognitive load - focus on what you want, not which tool to use

Perfect for teams wanting powerful AI collaboration without complexity!
