# Meta-Dynamic MCP Server
A single Model Context Protocol (MCP) proxy that aggregates multiple remote MCP endpoints (via HTTP-stream or SSE) and exposes them through one unified SSE interface.
Ideal for driving a single LLM client (e.g. Claude) while mixing in any number of specialized MCP servers (math, finance, etc.).
## 🔄 Why Meta-Dynamic vs Direct MCP Configuration

Traditionally, you would list each MCP server directly in your LLM client's `mcpServers` config. While straightforward, that approach has drawbacks:
- Tight coupling: Every time you add or remove an MCP endpoint, you must update the client config and restart the LLM process.
- Multiple connections: The client has to manage separate HTTP/SSE transports for each server, increasing complexity.
- No shared logic: Common patterns like namespacing, error handling, or retries must be re-implemented in every client.
Meta-Dynamic centralizes these concerns in one proxy:
- Single endpoint: Your LLM client only talks to `http://localhost:8080/sse`, regardless of how many backends you add.
- Dynamic remotes: Remotes are configured in one place (the proxy), decoupled from the LLM; add or remove them without touching the client.
- Unified logic: Namespacing, tool/resource aggregation, error handling, and transport selection live in a single codebase, reducing duplication.
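For example, an LLM client that supports the common `mcpServers` convention only ever needs one entry pointing at the proxy. This is a hedged sketch; the exact schema depends on your client, so check its documentation:

```json
{
  "mcpServers": {
    "meta-dynamic": {
      "url": "http://localhost:8080/sse"
    }
  }
}
```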
## 🔧 Prerequisites

- Node.js ≥ v16
- npm (or Yarn)
- A set of running MCP servers you want to proxy (e.g. a FastMCP math server on `http://localhost:8083/mcp`, CoinGecko's SSE-based MCP, etc.)
## 🏗️ Project Structure
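A minimal layout sketch. Only `src/index.ts` is referenced elsewhere in this README; the remaining entries are typical for a TypeScript project and may differ:

```
.
├── src/
│   └── index.ts    # proxy entry point: remotes list, request handlers, SSE server
├── package.json
├── tsconfig.json
└── dist/           # compiled output (after build)
```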
## 🚀 Installation & Development
- Clone & install
- Run in watch mode
- Build & run
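The steps above can be sketched as shell commands. The repository URL placeholder and the npm script names (`dev`, `build`) are assumptions; check `package.json` for the actual scripts:

```bash
# Clone & install
git clone <repository-url> meta-dynamic-mcp
cd meta-dynamic-mcp
npm install

# Run in watch mode (script name assumed)
npm run dev

# Build & run
npm run build
node dist/index.js
```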
## ⚙️ Configuration: Adding Remotes

Edit `src/index.ts` to define the list of MCP servers you wish to proxy. Each remote needs:

- `name`: a unique alias (used to namespace URIs & tool names)
- `url`: the full endpoint URL (HTTP-stream endpoints point to `/mcp`, SSE endpoints to the `/sse` path)
- `transport`: either `httpStream` or `sse`
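A hedged sketch of what the remotes list in `src/index.ts` might look like. Only the three fields (`name`, `url`, `transport`) come from this README; the type names and the example entries are assumptions:

```typescript
// Hypothetical sketch of the remotes list in src/index.ts.
type Transport = "httpStream" | "sse";

interface RemoteConfig {
  name: string;      // unique alias, used to namespace URIs & tool names
  url: string;       // full endpoint URL: /mcp for HTTP-stream, /sse for SSE
  transport: Transport;
}

const remotes: RemoteConfig[] = [
  // HTTP-stream remote, e.g. a FastMCP math server
  { name: "math", url: "http://localhost:8083/mcp", transport: "httpStream" },
  // SSE remote (placeholder URL for illustration)
  { name: "finance", url: "http://localhost:9000/sse", transport: "sse" },
];
```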
Note: The proxy exposes an SSE stream on port 8080 by default: `http://localhost:8080/sse`
## 📜 How It Works

- Remote initialization: connects to each MCP server using the specified transport.
- Request handlers:
  - `resources/list`, `resources/read` → fan-out & namespace by alias
  - `tools/list`, `tools/call` → aggregate & route tool invocations
- SSE endpoint: exposes a single SSE stream (`/sse`) and a message POST path (`/messages`) for any MCP-capable LLM client.
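The alias-based namespacing and routing described above can be sketched roughly as follows. The separator (`/`) and the helper names are assumptions for illustration, not the actual implementation:

```typescript
// Rough sketch of alias-based namespacing for aggregated tools.
interface Tool {
  name: string;
  description?: string;
}

// tools/list: merge every remote's tools, prefixing each name with the remote's
// alias so two backends can both expose a tool called "add" without colliding.
function aggregateTools(toolsByRemote: Record<string, Tool[]>): Tool[] {
  const merged: Tool[] = [];
  for (const [alias, tools] of Object.entries(toolsByRemote)) {
    for (const tool of tools) {
      merged.push({ ...tool, name: `${alias}/${tool.name}` });
    }
  }
  return merged;
}

// tools/call: split a namespaced name back into (alias, original tool name)
// so the invocation can be routed to the right remote.
function routeToolCall(namespaced: string): { alias: string; tool: string } {
  const sep = namespaced.indexOf("/");
  if (sep < 0) throw new Error(`not a namespaced tool name: ${namespaced}`);
  return { alias: namespaced.slice(0, sep), tool: namespaced.slice(sep + 1) };
}
```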
## 🧪 Testing

You can verify connectivity with `curl` or your LLM's built-in MCP client. Example with `curl` to list resources:
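A hedged smoke-test sketch: in the SSE transport the client first opens the stream, then POSTs JSON-RPC messages to the advertised message path. The exact `/messages` path and any session-id query parameter depend on what the first SSE event advertises:

```bash
# Open the SSE stream (stays open; the first event advertises the message endpoint)
curl -N http://localhost:8080/sse

# In another terminal, POST a JSON-RPC request to the messages path.
curl -X POST http://localhost:8080/messages \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"resources/list","params":{}}'
```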
## 🚧 Contributing
- Fork the repo
- Create a feature branch
- Submit a PR with tests/documentation
## 📄 License
Released under the MIT License. See LICENSE for details.