# MCP Burst

**An Express app with AI capabilities powered by MCP (Model Context Protocol)**

MCP Burst supports multiple server types and includes a built-in chat interface for testing. In effect, it acts as a bridge between MCP clients and an array of tools, MCP servers, and server gateways.

## What it does

- **Chat Integration**: Works with Ollama, OpenAI, and Docker Model Runner
- **MCP Server Support**: Runs Streamable HTTP servers, stdio servers such as the Docker MCP Gateway server, and n8n MCP trigger node URLs
- **Custom Tools**: Create discoverable tools at the `/mcp` endpoint
- **Multi-Client Support**: Set `MCP_BURST_CLIENT=true` in `.env` to serve multiple clients simultaneously

## Quick Status Check

- **Session debugging**: `curl http://localhost:4000/status`
- **Note**: Multi-client sessions are still being thoroughly tested

**NOTE: You may need to harden security before using this in production. Otherwise, test and be patient until a secure merge.**

## Running MCP Burst in MCP Clients

**VSCode**:

1. In your MCP Burst folder, run the hub:

   ```bash
   npm run start:hub
   ```

2. Create `.vscode/mcp.json` in your workspace, for example using this config:

   ```json
   {
     "servers": {
       "mcp-burst": {
         "type": "http",
         "url": "http://127.0.0.1:4000"
       }
     }
   }
   ```

**Claude Desktop**: Update your `claude_desktop_config.json` to include:

```json
{
  "mcpServers": {
    "mcp-burst": {
      "command": "node",
      "args": ["/path/to/mcpburst/hub/stdio-client-hub-entry.js"]
    }
  }
}
```

## Prebuilt Demo Chatbot Agent App

In your MCP Burst folder, run these commands:

1. Start the hub:

   ```bash
   npm run start:hub
   ```

2. Start the front-end proxy server:
   ```bash
   npm run start:server
   ```

## Table of Contents

- [Features](#features)
- [Prerequisites](#prerequisites)
- [Installation](#installation)
- [Configuration](#configuration)
- [Usage](#usage)
- [Directory Structure](#directory-structure)
- [Scripts](#scripts)
- [License](#license)

## Features

- Streamable HTTP MCP transport server built on the Model Context Protocol SDK
- Serve MCP Burst as an HTTP and/or stdio client *(tested with Claude Desktop, VS Code)*
- Built-in demo tools: `echo` and `update_session_planner`
- Express façade handling JSON-RPC at `/mcp` and health checks at `/health`
- Built-in Planner Tool (required for the built-in chatbot, but not necessarily for MCP clients)
- OPTIONAL: Built-in gamification (i.e., positive reinforcement for successfully completing tasks; it worked well with Claude Desktop, so I kept it in the repo)
- Demo Chatbot Agent App to test MCP integration, plus a session planner resource to execute multiple tools

## Prerequisites

- Node.js v16 or higher
- npm (included with Node.js)

## Installation

1. Clone the repository:

   ```sh
   git clone https://github.com/arberrexhepi/mcpburst.git
   cd mcpburst
   ```

2. Install root dependencies:

   ```sh
   npm install
   ```

3. Install hub and server dependencies:

   ```sh
   cd hub && npm install && cd ..
   ```

## Configuration

Create environment variable files in both `hub/` and `server/` directories:

### hub/.env

```ini
PORT=4000
HOST=localhost
MCP_REQUIRE_AUTH=false  # set to 'true' to require Bearer auth
MCP_BURST_CLIENT=true   # set to 'false' if you don't want it to be discoverable
```

### server/.env

```ini
OPENAI_API_KEY=your_openai_api_key
MCP_ENDPOINT=http://127.0.0.1:4000/mcp
PORT=3500
STRATEGY=DMR
DOCKER_MODEL_RUNNER_URL=http://localhost:12434/engines/llama.cpp/v1/chat/completions
OLLAMA_URL=http://localhost:11434/v1/chat/completions
```

## Usage

1. Build and start the MCP hub server:

   ```sh
   npm run start:hub
   ```

2. Start the front-end proxy server:

   ```sh
   npm run start:server
   ```

3.
Open your browser at [http://localhost:3000](http://localhost:3000) to access the chat UI.

4. The hub JSON-RPC endpoint is available at `http://localhost:4000/mcp`.

## Directory Structure

```
├── LICENSE
├── package.json              # root package config and scripts
├── README.md
├── hub/                      # MCP hub server
│   ├── package.json
│   ├── tsconfig.json
│   ├── dist/                 # TypeScript sources output to dist/
│   ├── installToolsFeature.ts
│   ├── installResourcesFeature.ts
│   ├── stdio-client-hub-entry.js
│   ├── bridgeBuilder.ts
│   ├── bridgeHttp.ts
│   ├── bridgeStdio.ts
│   ├── hub.ts
│   └── bridges/              # sample tool definitions
│       └── hub.yaml
└── server/                   # proxy and front-end assets
    ├── index.js
    ├── extractContent.js
    ├── public/
    │   ├── index.html
    │   ├── css/
    │   │   └── styles.css
    │   └── js/
    │       └── chat.js
    └── agent_functions/
        ├── index.js
        ├── llmClient.js
        ├── mcpClient.js
        ├── planExecutor.js
        └── README.MD
```

## Scripts

All scripts are defined in the root `package.json`:

- `npm run build`: Compile TypeScript in `hub` and copy bridge files
- `npm run copy:bridges`: Copy bridge YAML definitions to `hub/dist`
- `npm run start:hub`: Build then run the MCP hub server
- `npm run start:server`: Run the Express static server with the chat UI

## License

This project is licensed under the Apache License 2.0. See the [LICENSE](LICENSE) file for details.
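## Example: Building a `tools/call` Request

The hub's `/mcp` endpoint speaks JSON-RPC 2.0. As a minimal sketch of what a client sends, here is one way to build a request envelope for the built-in `echo` tool (the `tools/call` method name comes from the Model Context Protocol specification; the `message` argument shape for `echo` is an assumption for illustration):

```javascript
// Build a JSON-RPC 2.0 envelope for invoking an MCP tool.
// "tools/call" is the MCP method name from the spec; the argument
// shape for the demo `echo` tool is assumed here for illustration.
function buildToolCall(id, toolName, args) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: toolName, arguments: args },
  };
}

const request = buildToolCall(1, "echo", { message: "hello from MCP Burst" });
console.log(JSON.stringify(request, null, 2));
```

An MCP client such as Claude Desktop or VS Code produces envelopes like this for you; a sketch like the above is mainly useful when debugging the hub with `curl` or a small script.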

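## Example: Reading a Tool Result

Per the Model Context Protocol specification, a `tools/call` result wraps tool output in a `content` array of typed parts. A hedged sketch of pulling the text parts out of a parsed JSON-RPC response (the `result.content` shape follows the spec; error handling is deliberately minimal):

```javascript
// Extract the concatenated text parts from a parsed JSON-RPC
// tools/call response. The { result: { content: [...] } } shape
// follows the MCP specification; anything missing yields "".
function extractToolText(response) {
  const content = response?.result?.content ?? [];
  return content
    .filter((part) => part.type === "text")
    .map((part) => part.text)
    .join("\n");
}

// A response shaped like what the built-in `echo` tool might return:
const sample = {
  jsonrpc: "2.0",
  id: 1,
  result: { content: [{ type: "text", text: "hello from MCP Burst" }] },
};
console.log(extractToolText(sample)); // → "hello from MCP Burst"
```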