mcp-hub

A self-hosted remote MCP server that provides reusable prompts and conventions across AI tools.

Structure

modules/
└── dev/
    └── python.py   # Python/uv conventions → prompt: dev_python_uv

Each domain is a subfolder under modules/. Each file exposes a router (a FastMCP instance) that gets mounted in main.py.

Configuration

Copy .env.example to .env and adjust as needed:

MCP_HOST=0.0.0.0
MCP_PORT=8080

Run

uv run main.py

Connect

Add as an MCP server in your AI tool using:

  • Transport: Streamable HTTP

  • URL: http://<host>:<port>/mcp
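The URL is composed from the MCP_HOST/MCP_PORT settings plus FastMCP's default /mcp path for Streamable HTTP; a tiny hypothetical helper makes the composition explicit:

```python
def endpoint_url(host: str, port: int) -> str:
    """Build the Streamable HTTP endpoint URL from host and port."""
    return f"http://{host}:{port}/mcp"


print(endpoint_url("localhost", 8080))  # http://localhost:8080/mcp
```

Note that clients connect to the host's reachable address, not to the 0.0.0.0 bind address from .env.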

Deploy on Linux

Option 1 — screen (recommended): lets you detach and reattach to the session anytime.

screen -S mcp-hub uv run main.py
# Ctrl+A then D to detach
# Reattach later:
screen -r mcp-hub

Option 2 — nohup: fire-and-forget, no reattach.

nohup uv run main.py > mcp-hub.log 2>&1 &
echo $! > mcp-hub.pid   # save PID to stop it later
kill $(cat mcp-hub.pid) # stop

Add a new module

  1. Create modules/<domain>/<topic>.py with a router = FastMCP(...) and @router.prompt functions

  2. Mount it in main.py: mcp.mount(router, namespace="<domain>")

Prompts are namespaced as <namespace>_<prompt_name> (e.g. dev_python_uv).
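The naming rule above can be stated as a one-line illustrative helper (not part of the codebase):

```python
def full_prompt_name(namespace: str, prompt_name: str) -> str:
    """Compose the exposed prompt name from its mount namespace."""
    return f"{namespace}_{prompt_name}"


print(full_prompt_name("dev", "python_uv"))  # dev_python_uv
```

So a `python_uv` prompt in any module mounted with `namespace="dev"` is exposed to clients as `dev_python_uv`.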
