Multi MCP

A flexible and dynamic Multi-MCP Proxy Server that acts as a single MCP server while connecting to and routing between multiple backend MCP servers over STDIO or SSE.

πŸš€ Features

  • βœ… Supports both STDIO and SSE transports

  • βœ… Can connect to MCP servers running in either STDIO or SSE mode

  • βœ… Proxies requests to multiple MCP servers

  • βœ… Automatically initializes capabilities (tools, prompts, resources) from connected servers

  • βœ… Dynamically add/remove MCP servers at runtime (via HTTP API)

  • βœ… Supports tools with the same name on different servers (using namespacing)

  • βœ… Deployable on Kubernetes, exposing a single port to access all connected MCP servers through the proxy

πŸ“¦ Installation

To get started with this project locally:

# Clone the repository
git clone https://github.com/kfirtoledo/multi-mcp.git
cd multi-mcp

# Install using uv (recommended)
uv venv
uv pip install -r requirements.txt

πŸ–₯️ Running Locally

You can run the proxy locally in either STDIO or SSE mode depending on your needs:

1. STDIO Mode

For CLI-style operation (pipe-based communication). Used for chaining locally executed tools or agents.

uv run main.py --transport stdio
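
If you want to drive the proxy programmatically in this mode, a minimal client sketch (not part of this repository) using the official mcp Python SDK could look like this; the uv command and arguments mirror the invocation above:

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Spawn the proxy as a subprocess and communicate over stdin/stdout.
    params = StdioServerParameters(
        command="uv",
        args=["run", "main.py", "--transport", "stdio"],
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the tools aggregated from all configured backend servers.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())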

2. SSE Mode

Runs an HTTP SSE server that exposes a /sse endpoint. Useful for remote access, browser agents, and network-based tools.

uv run main.py --transport sse

Note: You can also configure the host and port using --host / --port arguments.

3. Production Mode (with External MCP Servers)

For production deployments with external MCP servers (GitHub, Brave Search, Context7), use the included startup script:

# Set required environment variables
export GITHUB_PERSONAL_ACCESS_TOKEN="your-token-here"
export BRAVE_API_KEY="your-api-key-here"
export MULTI_MCP_API_KEY="your-secret-key"  # Optional, for authentication

# Run the startup script
./start-server.sh

The production configuration is stored in msc/mcp.json (git-ignored for security). This configuration includes:

  • GitHub MCP Server: Repository management, issues, pull requests

  • Brave Search MCP Server: Web search capabilities

  • Context7 MCP Server: Library documentation and code examples

All servers use environment variable interpolation for secrets (e.g., ${GITHUB_PERSONAL_ACCESS_TOKEN}).
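
The interpolation logic itself is internal to the project, but the general pattern is straightforward. The sketch below is an illustration only (not the actual implementation) and assumes each server declares its secrets under an env block in the config:

import json
import os
import string

def expand_env(value: str) -> str:
    # Substitute ${NAME} placeholders from the current environment;
    # raises KeyError if a referenced variable is unset.
    return string.Template(value).substitute(os.environ)

with open("msc/mcp.json") as f:
    config = json.load(f)

for server in config["mcpServers"].values():
    env = server.get("env", {})
    for key, raw in env.items():
        env[key] = expand_env(raw)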

βš™οΈ Configuration

The proxy is initialized using a JSON config (default: ./mcp.json):

{ "mcpServers": { "weather": { "command": "python", "args": ["./tools/get_weather.py"] }, "calculator": { "command": "python", "args": ["./tools/calculator.py"] } } }

This config defines the initial list of MCP-compatible servers to spawn and connect at startup.

Note: Tool names are namespaced internally as server_name::tool_name to avoid conflicts and allow multiple servers to expose tools with the same base name. For example, if an MCP server named calculator provides an add tool, it will be referenced as calculator::add.
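
As an illustration, a client built on the official mcp Python SDK could invoke that namespaced tool over SSE roughly as follows. The port matches the default used elsewhere in this README, and the a/b arguments are hypothetical; they depend on the tool's actual input schema:

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Connect to the proxy's SSE endpoint.
    async with sse_client("http://localhost:8080/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Call the calculator server's add tool via its namespaced name.
            result = await session.call_tool("calculator::add", {"a": 2, "b": 3})
            print(result.content)

asyncio.run(main())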

You can also connect to a remote MCP server using SSE:

{ "mcpServers": { "weather": { "url": "http://127.0.0.1:9080/sse" } } }

More examples can be found in the examples/config/ directory.

πŸ”„ Dynamic Server Management (SSE only)

When running in SSE mode, you can add/remove/list MCP servers at runtime via HTTP endpoints:

Method   Endpoint              Description
GET      /mcp_servers          List active MCP servers
POST     /mcp_servers          Add a new MCP server
DELETE   /mcp_servers/{name}   Remove an MCP server by name
GET      /mcp_tools            List all available tools and their server sources

Example to add a new server:

curl -X POST http://localhost:8080/mcp_servers \
  -H "Content-Type: application/json" \
  --data @add_server.json

add_server.json:

{ "mcpServers": { "unit_converter": { "command": "python", "args": ["./tools/unit_converter.py"] } } }

🐳 Docker

You can containerize the SSE server and run it locally or deploy it to Kubernetes:

# Build the image
make docker-build

# Run locally with port exposure
make docker-run

Kubernetes

You can deploy the proxy in a Kubernetes cluster using the provided manifests.

Run with Kind

To run the proxy locally using Kind:

kind create cluster --name multi-mcp-test
kind load docker-image multi-mcp --name multi-mcp-test
kubectl apply -f k8s/multi-mcp.yaml

Exposing the Proxy

The K8s manifest exposes the SSE server via a NodePort (30080 by default). You can then connect to the SSE endpoint from outside the cluster:

http://<kind-node-ip>:30080/sse

Connecting to MCP Clients

Once the proxy is running, you can connect to it using any MCP-compatible client β€” such as a LangGraph agent or custom MCP client.

For example, using the langchain_mcp_adapters client, you can integrate directly with LangGraph to access tools from one or more backend MCP servers.

See examples/connect_langgraph_client.py for a working integration example.

Make sure your environment is set up with:

  • An MCP-compatible client (e.g. LangGraph)

  • .env file containing:

MODEL_NAME=<your-model-name>
BASE_URL=<https://your-openai-base-url>
OPENAI_API_KEY=<your-api-key>
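
A minimal end-to-end sketch of such an integration is shown below. It assumes a recent langchain_mcp_adapters release (with an awaitable get_tools()), langchain-openai, langgraph, and python-dotenv to load the .env file; point the SSE URL at your running proxy:

import asyncio
import os

from dotenv import load_dotenv
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

load_dotenv()  # reads MODEL_NAME, BASE_URL, OPENAI_API_KEY from .env

async def main():
    # Point the adapter at the proxy; it exposes tools from all backend servers.
    client = MultiServerMCPClient(
        {"multi_mcp": {"url": "http://localhost:8080/sse", "transport": "sse"}}
    )
    tools = await client.get_tools()

    model = ChatOpenAI(model=os.environ["MODEL_NAME"], base_url=os.environ["BASE_URL"])
    agent = create_react_agent(model, tools)

    result = await agent.ainvoke({"messages": [("user", "Which tools can you use?")]})
    print(result["messages"][-1].content)

asyncio.run(main())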

Inspiration

This project is inspired by and builds on ideas from two excellent open-source MCP projects.
