
1 MCP Server 🚀

MCP of MCPs: automatically discover and configure MCP servers on your machine (remote or local).

After setup, you can usually just say:

“I want to perform <task>. Call the deep_search tool and follow the outlined steps.”

The goal is that you install only this MCP server, and it handles the rest (searching, selecting, and configuring servers).

Demo video 🎥: https://youtu.be/W4EAmaTTb2A

Quick Setup

Choose one of the following:

  1. Remote (simplest & fastest ⚡💨)

  2. Local (prebuilt): Docker, uvx, or npx

  3. Local (from source): run this repo directly

1) Remote 🌐⚡💨

Use the hosted endpoint (recommended for the simplest setup).

Docs + guided setup: https://mcp.1mcpserver.com/

Configure your MCP client

Add the following entry to your client config file:

  • Cursor: ./.cursor/mcp.json

  • Gemini CLI: ./.gemini/settings.json (see Gemini docs)

  • Claude Desktop:

    • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json

    • Windows: %APPDATA%\Claude\claude_desktop_config.json

  • Codex:

    • macOS: ~/.codex/config.toml

    • Windows: %USERPROFILE%\.codex\config.toml

Remote config (JSON):

{
  "mcpServers": {
    "1mcpserver": {
      "url": "https://mcp.1mcpserver.com/mcp/",
      "headers": {
        "Accept": "text/event-stream",
        "Cache-Control": "no-cache"
      }
    }
  }
}

If you already have other servers configured, just merge this entry under mcpServers. For example:

{
  "mcpServers": {
    "1mcpserver": {
      "url": "https://mcp.1mcpserver.com/mcp/",
      "headers": {
        "Accept": "text/event-stream",
        "Cache-Control": "no-cache"
      }
    },
    "file-system": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }
  }
}

Tip: If your client supports it, move the config file into your home directory to apply globally.
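If you would rather script the merge than edit JSON by hand, here is a minimal sketch. The `merge_config` helper and the target file path are illustrative, not part of this project:

```python
import json
from pathlib import Path

# The 1mcpserver entry from the Remote config above.
ENTRY = {
    "1mcpserver": {
        "url": "https://mcp.1mcpserver.com/mcp/",
        "headers": {"Accept": "text/event-stream", "Cache-Control": "no-cache"},
    }
}

def merge_config(path: Path) -> dict:
    """Merge the 1mcpserver entry into an existing client config,
    preserving any servers already configured under mcpServers."""
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {}).update(ENTRY)
    path.write_text(json.dumps(config, indent=2))
    return config
```

For example, `merge_config(Path(".cursor/mcp.json"))` would add the entry to a Cursor project config without clobbering other servers.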


2) Local (prebuilt) 💻

Use this when you want everything local, or when your MCP client only supports STDIO.

2A) Docker 🐳

Use this if you want an isolated runtime and a single, reproducible command.

docker run --rm -i \
  -e DATADIR=/data \
  -v "$PWD/db:/data" \
  <YOUR_DOCKER_IMAGE_HERE>

{
  "mcpServers": {
    "1mcpserver": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "DATADIR=/data",
        "-v", "${PWD}/db:/data",
        "<YOUR_DOCKER_IMAGE_HERE>"
      ]
    }
  }
}

2B) uvx 🐍

Use this if you publish the server as a Python package and want a one-liner.

uvx <YOUR_PACKAGE_NAME> --local
{
  "mcpServers": {
    "1mcpserver": {
      "command": "uvx",
      "args": ["<YOUR_PACKAGE_NAME>", "--local"]
    }
  }
}

2C) npx 📦

Use this if you publish a Node wrapper / launcher and want a one-liner.

npx -y <YOUR_NPM_PACKAGE_NAME>
{
  "mcpServers": {
    "1mcpserver": {
      "command": "npx",
      "args": ["-y", "<YOUR_NPM_PACKAGE_NAME>"]
    }
  }
}

3) Local (from source) 🧩

Clone this repo and run it directly.

git clone https://github.com/particlefuture/MCPDiscovery.git
cd MCPDiscovery
uv sync
uv run server.py --local

{
  "mcpServers": {
    "1mcpserver": {
      "command": "/path/to/uv",
      "args": [
        "--directory",
        "<PATH_TO_CLONED_REPO>",
        "run",
        "server.py",
        "--local"
      ]
    }
  }
}
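Whichever local option you choose, a quick sanity check of the resulting config file can catch typos before your client does. This is a minimal sketch assuming only the config shapes shown above; `check_mcp_config` is a hypothetical helper, not part of this repo:

```python
import json

def check_mcp_config(raw: str) -> list[str]:
    """Return a list of problems found in an MCP client config string.

    Each entry under "mcpServers" should define either a remote "url"
    or a local "command" (with optional "args" as a list).
    """
    problems = []
    config = json.loads(raw)
    servers = config.get("mcpServers")
    if not isinstance(servers, dict):
        return ['missing top-level "mcpServers" object']
    for name, spec in servers.items():
        if "url" not in spec and "command" not in spec:
            problems.append(f'{name}: needs either "url" or "command"')
        if "args" in spec and not isinstance(spec["args"], list):
            problems.append(f'{name}: "args" must be a list')
    return problems
```

An empty return value means the config looks structurally sound; any strings returned describe the offending entries.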

If your client supports remote (URL-based) servers, you can use the Remote setup instead.

Optional: grant file-system access 📁

If you want your LLM to have file-system access, add an MCP filesystem server and point it at the directory you want to expose. Note that ~/ grants access to your entire home directory; scope it to a narrower path when possible:

{
  "mcpServers": {
    "file-system": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "~/"]
    }
  }
}


Architecture 🧠

There are two search modes:

  • For explicit requests like “I want an MCP server that handles payments,” it returns a shortlist of relevant MCP servers.

  • For higher-level or complex goals like “Build a website that analyzes other websites,” Deep Search has the LLM break the goal into components/steps and find MCP servers for each part. If something is missing, it asks whether to:

  • ignore that part,

  • break it down further, or

  • implement it ourselves.

Deep Search stages:

  1. Planning: identify servers, keys, and config changes

  2. Testing: verify servers (via test_server_template_code)

  3. Acting: execute the workflow using the configured servers
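The three stages can be sketched as a simple pipeline. Everything below is illustrative: the function names and the naive goal decomposition are assumptions for the sketch, not the server's actual API:

```python
# Illustrative sketch of the Deep Search flow (hypothetical names,
# not the internals of this server).

def deep_search(goal: str, search_fn, test_fn, run_fn):
    # 1) Planning: break the goal into components and find a server per part.
    components = [part.strip() for part in goal.split(" and ")]  # naive decomposition
    plan = {part: search_fn(part) for part in components}

    # Parts with no matching server need a user decision:
    # ignore them, break them down further, or implement them by hand.
    missing = [part for part, server in plan.items() if server is None]

    # 2) Testing: verify each selected server before use.
    verified = {p: s for p, s in plan.items() if s is not None and test_fn(s)}

    # 3) Acting: execute the workflow with the verified servers.
    results = {p: run_fn(s) for p, s in verified.items()}
    return results, missing
```

In practice the decomposition and server selection are done by the LLM rather than string splitting; the sketch only shows how the three stages hand off to each other.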


Change Log 🕒

  • July 31 2025: Upgraded to 0.2.0; added agentic planning.

  • Dec 12 2025: Support for Gemini + Codex

  • Dec 13 2025: Easier local setup with docker, npm, and uvx.

Future 🔮

  • Better demo videos (new domain, narrated walkthrough)

  • Model Context Communication Protocol (MCCP): standard server-to-server messaging

  • Avoid calling tools with an internal_ prefix unless instructed

  • Improve MCP server database schema: server, description, url, config json, extra setup (docker/api key/etc)

Credits 🙏

Data sources:

  • wong2/awesome-mcp-servers

  • metorial/mcp-containers

  • punkpeye/awesome-mcp-servers

  • modelcontextprotocol/servers


Troubleshooting 🧰

  • If you use a venv and get ModuleNotFoundError even after installing dependencies: delete the venv and recreate it.
