Wraps the GitHub MCP server to provide token-efficient access to repository management, issue tracking, and other GitHub functionalities via a lazy-loading meta-interface.
Provides sandboxed execution of JavaScript code with access to MCP tools through typed wrappers, allowing AI to execute scripts within a secure environment.
Enables interaction with Jira instances by wrapping Jira MCP servers, allowing AI agents to search for, create, and manage issues.
Provides sandboxed execution of TypeScript code with access to MCP tools through typed wrappers, allowing AI to execute complex scripts with full type support.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Meta-MCP Server List the available backend servers and their tool summaries"
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
Meta-MCP Server

A Model Context Protocol (MCP) server that wraps multiple backend MCP servers for token-efficient tool discovery via lazy loading.
Monorepo Structure
This project is organized as a monorepo with the following packages:
| Package | Description | Install |
| --- | --- | --- |
| core | Core utilities: types, connection pool, registry, tool cache | |
| meta-mcp-server | MCP server exposing 3 meta-tools for token optimization | `npm install -g @justanothermldude/meta-mcp-server` |
| mcp-exec | Sandboxed code execution with MCP tool access via typed wrappers | `npm install -g @justanothermldude/mcp-exec` |
Problem
When Claude/Droid connects to many MCP servers, it loads all tool schemas upfront - potentially 100+ tools consuming significant context tokens before any work begins.
Solution
Meta-MCP exposes only 3 tools to the AI:
| Tool | Purpose |
| --- | --- |
| | List available backend servers (lightweight, no schemas) |
| | Fetch tools from a server; supports lightweight summaries or full schemas on demand |
| | Execute a tool on a backend server |
Backend servers are spawned lazily on first access and managed via a connection pool.
Features
Lazy Loading: Servers spawn only when first accessed
Two-Tier Tool Discovery: Fetch summaries first (~100 tokens), then specific schemas on-demand
Connection Pool: LRU eviction (max 20 connections) with idle cleanup (5 min)
Multi-Transport: Supports Node, Docker, and uvx/npx spawn types
Tool Caching: Tool definitions cached per-server for session duration
VS Code Extension: Visual UI for managing servers and configuring AI tools
Sandboxed Execution: Execute code in isolated environments with MCP tool access
Quick Start
Option 1: VS Code/Cursor Extension (Recommended)
The Meta-MCP extension provides a visual interface for configuration:
Install the extension from `extension/meta-mcp-configurator-0.1.2.vsix`
Open the Meta-MCP panel - click the Meta-MCP icon in the activity bar (left sidebar)
Go to the Setup tab and complete the setup wizard:
Step 1: Install meta-mcp-server
Click Install via npm (opens a terminal with `npm install -g @justanothermldude/meta-mcp-server`), or run manually:
npm install -g @justanothermldude/meta-mcp-server
Step 1b: Install mcp-exec (Optional)
Click Install next to mcp-exec for sandboxed code execution with MCP tool access
Or run manually:
npm install -g @justanothermldude/mcp-exec
mcp-exec enables AI to execute TypeScript/JavaScript code with typed wrappers for your MCP servers.
Step 2: Configure Your AI Tools
The extension auto-detects installed AI tools and shows their status:
| Tool | Config Location | Detection |
| --- | --- | --- |
| Claude | `~/.claude.json` | |
| Cursor | `~/.cursor/mcp.json` | |
| Droid (Factory) | `~/.factory/mcp.json` | |
| VS Code | | |
For each detected tool, use these buttons:
| Button | Action |
| --- | --- |
| Configure | Auto-configures the tool: adds meta-mcp and mcp-exec (if installed globally) and migrates existing servers to `~/.meta-mcp/servers.json` |
| Copy Snippet | Copies the JSON config to the clipboard for manual setup |
The Configure button intelligently:
Detects which packages are installed (`npm list -g`)
Adds only installed packages to the tool config
Migrates any existing MCP servers to `~/.meta-mcp/servers.json`
Shows the migration count in the success message
Other Platforms (Windsurf, Augment, etc.)
For tools not auto-detected, copy and adapt this snippet:
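A typical snippet looks roughly like the sketch below (illustrative only - the extension's Copy Snippet output and your tool's exact config format may differ; it assumes the standard `mcpServers` layout and that both packages are launched via npx):

```jsonc
// Illustrative example - adapt key names and commands to your tool's MCP config format.
{
  "mcpServers": {
    "meta-mcp": {
      "command": "npx",
      "args": ["-y", "@justanothermldude/meta-mcp-server"]
    },
    "mcp-exec": {
      "command": "npx",
      "args": ["-y", "@justanothermldude/mcp-exec"]
    }
  }
}
```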
Restart your AI tool to load the new configuration
Add servers from the Catalog tab or Servers tab manually
Option 2: npm Package
Install the package globally with `npm install -g @justanothermldude/meta-mcp-server`, then add it to your AI tool config (see Configuration below).
Option 3: Build from Source
Configuration
servers.json
All MCP servers are configured in ~/.meta-mcp/servers.json:
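An illustrative servers.json entry is sketched below; the field names and the example backend are placeholders based on the spawn types and optional timeout described in this README, not a verbatim copy of the project's schema:

```jsonc
// Illustrative only - the exact field names are defined by meta-mcp-server.
{
  "github": {
    "command": "npx",                                      // Node/npx spawn type
    "args": ["-y", "@modelcontextprotocol/server-github"],
    "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "<token>" },
    "timeout": 30000                                       // optional per-server timeout (ms)
  }
}
```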
Note: The optional `timeout` field sets a per-server timeout in milliseconds. This overrides `MCP_DEFAULT_TIMEOUT`.
Internal MCP Servers
For internal/corporate MCP servers (like corp-jira), the extension handles setup automatically:
Click Add on an Internal server in the Catalog
If not found locally, choose Clone Repository - the extension opens a terminal and runs:
git clone https://github.com/Adobe-AIFoundations/adobe-mcp-servers.git
cd adobe-mcp-servers && npm install && npm run build
Once built, click Add again - the server will be auto-detected via Spotlight (macOS)
Manual setup (if needed):
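For example, a locally built internal server can be registered in ~/.meta-mcp/servers.json by pointing at its build output; the path and field names below are placeholders:

```jsonc
// Illustrative only - substitute the real path to your local build.
{
  "corp-jira": {
    "command": "node",
    "args": ["/path/to/adobe-mcp-servers/packages/corp-jira/dist/index.js"]
  }
}
```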
AI Tool Configuration
Add meta-mcp to your AI tool's config file:
Claude (~/.claude.json):
Droid (~/.factory/mcp.json):
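Both files take the same shape of entry. A minimal sketch, assuming the published package is launched via npx (the "meta-mcp" key is just a label):

```jsonc
// Illustrative - merge into the existing ~/.claude.json or ~/.factory/mcp.json.
{
  "mcpServers": {
    "meta-mcp": {
      "command": "npx",
      "args": ["-y", "@justanothermldude/meta-mcp-server"]
    }
  }
}
```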
Using local build (instead of npx):
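To run a local build instead, point the command at the built entry file; the dist path below is a placeholder for wherever your build lands:

```jsonc
// Illustrative - replace the path with your actual build output.
{
  "mcpServers": {
    "meta-mcp": {
      "command": "node",
      "args": ["/path/to/meta-mcp/packages/meta-mcp-server/dist/index.js"]
    }
  }
}
```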
Restart your AI tool
Restart Claude or Droid to load the new configuration.
Usage
Once configured, the AI will see only 3 tools instead of all backend tools:
Two-Tier Lazy Loading
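A sketch of the two-tier flow is shown below; the meta-tool and backend-tool names are hypothetical placeholders, not the server's actual identifiers:

```jsonc
// Hypothetical request sequence - all tool names are placeholders.
[
  { "tool": "list_servers", "arguments": {} },                                                 // lightweight server list, no schemas
  { "tool": "get_tools",    "arguments": { "server": "github", "summary": true } },            // tier 1: ~100-token summaries
  { "tool": "get_tools",    "arguments": { "server": "github", "tools": ["create_issue"] } },  // tier 2: full schema on demand
  { "tool": "execute_tool", "arguments": { "server": "github", "tool": "create_issue", "args": { "title": "..." } } } // run it
]
```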
See Token Economics for detailed analysis of 87-91% token savings across different workflow patterns.
Development
Monorepo Commands
Package-Specific Development
Testing
Architecture
For detailed architecture documentation with diagrams, see:
Architecture Guide - Complete narrative guide with all concepts explained
Diagram Index - Visual diagrams organized by topic
Core Mechanics - Pool, connections, caching, tool system
Token Economics - 87-91% savings, ROI analysis
Monorepo Package Structure
Configuration Options
| Environment Variable | Default | Description |
| --- | --- | --- |
| | `~/.meta-mcp/servers.json` | Path to the backends configuration file |
| | 20 | Maximum concurrent server connections |
| | | Idle connection cleanup timeout (5 min) |
| `MCP_DEFAULT_TIMEOUT` | none | Global timeout for MCP tool calls (ms); the per-server `timeout` field overrides this |
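These variables are read from the environment that launches meta-mcp, e.g. via the env block of its entry in your AI tool config. Only MCP_DEFAULT_TIMEOUT is shown below since it is the variable named above; the rest of the entry is illustrative:

```jsonc
// Illustrative - sets a 60s global timeout for MCP tool calls.
{
  "mcpServers": {
    "meta-mcp": {
      "command": "npx",
      "args": ["-y", "@justanothermldude/meta-mcp-server"],
      "env": { "MCP_DEFAULT_TIMEOUT": "60000" }
    }
  }
}
```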
Test Results
341 tests passing (unit + integration across all packages)
48 integration tests skipped by default (require `RUN_REAL_MCP_TESTS=true`)
Tested with Node, Docker, and uvx/npx spawn types