Allows JetBrains IDEs to connect to the messaging relay, facilitating cross-IDE collaboration and tool coordination.
Enables LangChain agents to communicate with developer IDEs, supporting complex AI workflows and multi-agent coordination.
Allows Next.js web applications to integrate with the bridge via a proxy pattern, enabling web-based tools to interact directly with IDE clients.
Enables PyCharm to function as a client in the bridge, allowing it to send and receive messages with other connected development environments.
MCP IDE Bridge
🎬 Demo Video
A stateless, open source MCP (Model Context Protocol) HTTP Streamable server that enables client-to-client communication between IDEs and development tools. This opens up a new dimension of collaboration beyond traditional MCP client-server interactions.
🚀 Perfect for: Cross-IDE collaboration, team development workflows, AI agent coordination, and seamless tool integration.
🌟 What Makes This Special?
Traditional MCP vs MCP IDE Bridge
| Traditional MCP | MCP IDE Bridge |
|-----------------|----------------|
| Client → Server | Client ↔ Server ↔ Client |
| One-way communication | Bidirectional messaging |
| Tool execution only | Real-time collaboration |
| Single IDE focus | Multi-IDE coordination |
Real-World Use Cases
🎯 IDE Collaboration
Cursor ↔ Cursor: Share code snippets, debugging sessions, or pair programming
Cursor ↔ VS Code: Cross-editor communication and file sharing
Windsurf ↔ Any IDE: AI agent coordination across different development environments
Team Workflows: Coordinate multiple developers working on the same project
🤖 AI Agent Coordination
Agent-to-agent communication for complex workflows
Distributed AI processing across multiple tools
Human-in-the-loop collaboration with AI assistants
🏗️ Architecture
Client-to-Client Communication
Key Components
Message Relay: Stateless server that routes messages between clients
Client Registry: Dynamic client discovery and registration
Message Queues: Per-recipient queues with automatic expiration
HTTP Streamable: Latest MCP transport for real-time communication
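To make the relay model concrete, here is a minimal Python sketch of per-recipient queues with automatic expiration. It is illustrative only, not the server's actual implementation; the class and method names are invented here, and the 5-minute TTL comes from the security model described later in this README.

```python
import time
from collections import defaultdict, deque

MESSAGE_TTL_SECONDS = 300  # messages expire automatically after 5 minutes


class MessageRelay:
    """Minimal in-memory relay: per-recipient queues, no persistence,
    messages dropped once read or once they expire."""

    def __init__(self, ttl: float = MESSAGE_TTL_SECONDS):
        self.ttl = ttl
        # recipient_id -> deque of (timestamp, sender_id, message)
        self.queues = defaultdict(deque)

    def send(self, sender_id: str, recipient_id: str, message: str) -> None:
        # Fire-and-forget: enqueue and return immediately, no blocking.
        self.queues[recipient_id].append((time.monotonic(), sender_id, message))

    def get_messages(self, client_id: str):
        # Drain the caller's queue, discarding anything past the TTL.
        now = time.monotonic()
        fresh = [(sender, msg) for ts, sender, msg in self.queues[client_id]
                 if now - ts < self.ttl]
        self.queues[client_id].clear()
        return fresh
```

Because the server keeps nothing but these short-lived queues, it stays stateless across restarts: clients simply check in again and resume messaging.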
🚀 Quick Start
1. Start the Server
Docker (Recommended):
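A sketch of the Docker workflow; the image and container names are assumptions, so check the repository's Dockerfile and docs for the canonical commands:

```shell
# Build the image from the repo root and run it on the default port
docker build -t mcp-ide-bridge .
docker run -d --name mcp-ide-bridge -p 8111:8111 mcp-ide-bridge
```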
Default Configuration:
Port: 8111 (both external and internal)
Host: 0.0.0.0 (accepts connections from any interface)
Transport: HTTP Streamable (MCP latest)
Health Check: Built-in endpoint monitoring
Python (Development Setup):
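A typical local run might look like the following; the server entry point is an assumption based on the mcp_messaging package named later in this README:

```shell
python -m venv .venv && source .venv/bin/activate
pip install -e .                # required so Python can find mcp_messaging
python -m mcp_messaging.server  # entry point and flags are assumptions
```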
2. Configure Your IDE
Create mcp_recipients.json in your project root. Each project gets ONE file with its own unique ID and list of recipients it can communicate with:
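A minimal mcp_recipients.json might look like this. The exact field names here are illustrative assumptions; treat examples/multi-project-setup/ as the authoritative schema:

```json
{
  "my_id": "my_cursor",
  "recipients": {
    "teammate_vscode": {
      "name": "Teammate's VS Code",
      "description": "Backend development"
    },
    "teammate_windsurf": {
      "name": "Teammate's Windsurf",
      "description": "Frontend development"
    }
  }
}
```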
🤖 AI Agent Generation: Your IDE's AI agent can generate this file! Simply ask:
Cursor: "Generate an mcp_recipients.json for my project"
VS Code: "Create mcp_recipients.json configuration for my team"
Windsurf: "Help me set up mcp_recipients.json for collaboration"
📁 Multi-Project Examples: See examples/multi-project-setup/ for examples showing how different projects communicate. Each project's file must be named mcp_recipients.json; the filenames in that folder are just for reference.
3. Connect Your IDE
Cursor IDE:
Create `.cursor/mcp.json`:
Open Command Palette (`Cmd/Ctrl + Shift + P`)
Search for "MCP: Connect to Server"
Enter: `http://localhost:8111/mcp/`
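The .cursor/mcp.json file referenced above might look like this; it follows the common Cursor MCP configuration shape, so verify the exact keys against your Cursor version:

```json
{
  "mcpServers": {
    "mcp-ide-bridge": {
      "url": "http://localhost:8111/mcp/"
    }
  }
}
```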
VS Code:
Install MCP extension from marketplace
Create `mcp_recipients.json` in project root
Configure MCP settings in VS Code preferences
Use MCP commands to connect and collaborate
Windsurf:
Create `mcp_recipients.json` in project root
Open Windsurf settings → MCP configuration
Add server URL: `http://localhost:8111/mcp/`
Start messaging with other IDEs
Claude Desktop:
Create `mcp_recipients.json` in project root
Open Claude Desktop settings → MCP configuration
Add server URL: `http://localhost:8111/mcp/`
Use Claude's MCP integration to communicate
JetBrains IDEs (IntelliJ, PyCharm, etc.):
Install MCP plugin from plugin marketplace
Create `mcp_recipients.json` in project root
Configure MCP server in plugin settings
Use MCP tools from the IDE
Note: Each IDE requires both mcp_recipients.json (for messaging) and IDE-specific MCP configuration (for connection). Each project gets ONE file; it must be named exactly mcp_recipients.json and placed in the project root for easy discovery by IDE agents. See examples/multi-project-setup/README.md for detailed setup instructions.
🌐 Non-IDE Clients (LangChain, mcp-use, Custom Apps)
Overview
Non-IDE clients use the exact same MCP protocol as IDE clients. The only difference is how they provide their configuration:
IDE clients: Read `mcp_recipients.json` from the local file system
Non-IDE clients: Provide `recipients_config` as a parameter to MCP tools
No registration, no REST endpoints, no special setup - just parameter injection!
This enables seamless integration with frameworks like LangChain, mcp-use, custom Python scripts, and web applications.
Architecture
Setup - Client Wrapper Approach
Create a wrapper that automatically injects your configuration:
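For instance, a small Python helper can build each tools/call request and inject the configuration that IDE clients would read from mcp_recipients.json. The JSON-RPC tools/call shape comes from the MCP specification, and the tool names below are listed in this README; the helper itself and argument names like `sender_id` and `recipients_config` are an illustrative sketch:

```python
from typing import Any

# Messaging tools named in this README; only these get configuration injected.
MESSAGING_TOOLS = {
    "send_message_without_waiting",
    "get_messages",
    "get_my_identity",
    "checkin_client",
}


def build_tool_call(tool: str, arguments: dict[str, Any], *,
                    sender_id: str, recipients_config: dict[str, Any],
                    request_id: int = 1) -> dict[str, Any]:
    """Build a JSON-RPC 2.0 `tools/call` request, injecting the sender
    identity and recipient config for messaging tools only."""
    args = dict(arguments)
    if tool in MESSAGING_TOOLS:
        # Parameter injection instead of a local mcp_recipients.json file.
        args.setdefault("sender_id", sender_id)
        args.setdefault("recipients_config", recipients_config)
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": args},
    }
```

A LangChain or mcp-use integration can call a wrapper like this before every messaging-tool invocation, then POST the resulting dict as JSON to `http://localhost:8111/mcp/` with an Accept header that includes `text/event-stream`.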
LangChain Integration:
mcp-use Integration:
Real-World Implementation: Proxy Pattern
For production web applications, the recommended approach is a proxy/interceptor pattern that selectively handles messaging tools:
Next.js API Route Example (dyson_frontend implementation):
Setup Steps for Non-IDE Clients:
Create MCP proxy endpoint (`/api/mcp-proxy` or equivalent)
Hardcode your recipient configuration (no `mcp_recipients.json` files needed)
Intercept only messaging tools: `send_message_without_waiting`, `get_messages`, `get_my_identity`, `checkin_client`
Inject required parameters where missing (`sender_id`, `client_id`, etc.)
Override `get_my_identity` to return your config as markdown
Forward everything else unchanged (conservative approach)
Framework Examples:
Benefits
🔌 Simple Integration: Same protocol as IDE clients
📡 No Special Setup: Just parameter injection
🔒 Client-Side Control: Proxy manages configuration
🛠️ Framework Agnostic: Works with any MCP client library
🛡️ Conservative Approach: Only intercepts what's needed (99% of traffic unchanged)
💾 No File Dependencies: Runtime configuration, no mcp_recipients.json required
🔧 Production Ready: Real-world pattern used by active projects
🛠️ Available Tools
Core Messaging Tools
| Tool | Description | Use Case |
|------|-------------|----------|
| `checkin_client` | Register your presence | Announce availability |
| `send_message_without_waiting` | Fire & forget messaging | ONLY messaging method |
| `get_messages` | 📬 ESSENTIAL - Check for replies | Required after messaging |
| `get_my_identity` | Get configuration help | Setup assistance |
| | View active connections | Monitor team activity |
🔄 Messaging Workflow
MESSAGING PATTERN: Fire-and-forget + get_messages for efficient communication:
1. Send Messages (Fire & Forget):
2. Check for Replies:
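Over raw HTTP, the two steps might look like the following sketch. The endpoint and tool names come from this README; argument names such as `recipient_id`, `message`, and `client_id` are assumptions:

```shell
# 1. Fire & forget: returns immediately, no blocking
curl -s http://localhost:8111/mcp/ \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"send_message_without_waiting","arguments":{"sender_id":"my_cursor","recipient_id":"teammate_vscode","message":"Please review auth.py"}}}'

# 2. Later, poll for replies
curl -s http://localhost:8111/mcp/ \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"get_messages","arguments":{"client_id":"my_cursor"}}}'
```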
Message Patterns:
Benefits:
✅ No Blocking: Instant return, no waits
✅ Scalable: Works for one or more recipients efficiently
✅ Fast: No timeouts or blocking calls
✅ Better UX: Smooth, responsive messaging experience
Example Workflows
Team Collaboration
AI Agent Coordination
🔒 Security Considerations
Current State (Desktop Use)
✅ Suitable for:
Local development teams
Personal projects
Desktop-only workflows
Trusted network environments
⚠️ Limitations:
No authentication beyond client IDs
No encryption of messages
No access control
No audit logging
🔒 Security Model:
Client IDs act as simple credentials
Messages stored in memory only
5-minute automatic expiration
No persistent storage
Enterprise Solution
For production use, security, and team collaboration, we offer MilesDyson.ai - an enterprise-grade Agentic Platform as a Service (aPaaS) that addresses all security concerns:
🔐 Enterprise Authentication: SSO, RBAC, and audit trails
🛡️ End-to-End Encryption: All messages encrypted in transit and at rest
🌍 Global Infrastructure: Multi-region deployment with 99.9% uptime
👥 Team Management: User management, permissions, and collaboration tools
📊 Analytics: Usage insights and performance monitoring
🔧 Enterprise Support: Dedicated support and custom integrations
🧪 Testing
MCP Test Harness (Recommended)
NEW! We've included a comprehensive MCP test harness (test_mcp_client.py) that makes testing all MCP tools easy and reliable:
Features:
✅ Proper MCP Headers: Handles `text/event-stream` and streaming responses correctly
✅ Beautiful Output: Clean markdown display with raw JSON debugging
✅ All Tools Supported: Test every MCP tool with proper argument handling
✅ Flexible Arguments: Use individual flags or JSON for complex parameters
✅ Error Handling: Clear error messages and troubleshooting info
Installation:
Quick Connection Test
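Assuming the server is running locally, a quick way to confirm connectivity is to list the available tools. The tools/list method is standard MCP; the URL is the default from this README:

```shell
curl -s http://localhost:8111/mcp/ \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
```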
Reference Client
The project includes a reference MCP client for testing:
🏗️ Development
Project Structure
Note: Each project gets ONE mcp_recipients.json file with its own unique ID and recipient list. The example filenames in multi-project-setup/ are just for reference - your actual file must be named mcp_recipients.json in each project root.
Local Development
⚠️ Important: The `pip install -e .` step is required for Python to properly find the `mcp_messaging` module. Without this, you'll get ModuleNotFoundError: No module named 'mcp_messaging'.
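Putting that together, a local development setup might look like this sketch; it assumes the repository lives on GitHub under Mvp2o-ai/mcp-ide-bridge (as referenced in the Contributing section), and the server entry point is an assumption:

```shell
git clone https://github.com/Mvp2o-ai/mcp-ide-bridge.git
cd mcp-ide-bridge
pip install -e .                # avoids ModuleNotFoundError for mcp_messaging
python -m mcp_messaging.server  # entry point is an assumption
```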
🤝 Contributing
We welcome contributions! Please see CONTRIBUTING.md for:
Development setup
Code style guidelines
Testing procedures
Pull request process
📄 License
MIT License - see LICENSE for details.
🚀 Enterprise Solution
Ready for production use?
MilesDyson.ai provides enterprise-grade MCP IDE Bridge with:
🔐 Enterprise Security: SSO, encryption, audit trails
🌍 Global Infrastructure: Multi-region, high availability
👥 Team Management: User management and collaboration tools
📊 Analytics & Monitoring: Usage insights and performance tracking
🔧 Enterprise Support: Dedicated support and custom integrations
Perfect for:
Development teams
Enterprise environments
Production deployments
Multi-organization collaboration
Built with MCP HTTP Streamable transport • Powered by FastMCP • Made with ❤️ by MVP2o.ai
Contributing via Pull Requests
We welcome contributions! To submit changes:
Fork this repository and clone your fork.
Create a new feature branch from your fork's main branch: `git checkout -b feature/your-feature-name`
Make your changes and commit them to your feature branch.
Push your branch to your fork: `git push --set-upstream origin feature/your-feature-name`
Open a pull request from your fork/branch to the `main` branch of the upstream repository (Mvp2o-ai/mcp-ide-bridge).
Wait for review and feedback from the maintainers.
See CONTRIBUTING.md for more details.