MCP BMI Server Template (FastAPI + SSE)

A minimal Model Context Protocol (MCP) server exposing a single tool: bmiCalculator.

  • Discovery endpoint (SSE): GET /mcp

  • Invocation endpoint: POST /invoke

  • Health check: GET /healthz
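
main.py itself is not reproduced on this page; the following is a minimal sketch of how those three endpoints could be wired up in FastAPI. The MCPHello class name, the endpoint paths, and the x-api-key header come from this README, while the SSE event shape, parameter schema, and response format are assumptions rather than the repo's actual code.

# Minimal sketch of the three endpoints above (not the repository's actual main.py).
import json
import os
from typing import Optional

from fastapi import FastAPI, Header, HTTPException
from fastapi.responses import StreamingResponse

app = FastAPI()
API_KEY = os.getenv("API_KEY")  # optional shared secret checked via the x-api-key header


class MCPHello:
    """Tool catalogue advertised by the SSE discovery endpoint (shape assumed)."""
    tools = [
        {
            "name": "bmiCalculator",
            "description": "Compute Body Mass Index from weight (kg) and height.",
            "parameters": {"weight": "number", "height": "number", "unit": "cm|m"},
        }
    ]


def check_key(x_api_key: Optional[str]) -> None:
    # Only enforced when API_KEY is set in the environment.
    if API_KEY and x_api_key != API_KEY:
        raise HTTPException(status_code=401, detail="invalid or missing x-api-key")


@app.get("/healthz")
def healthz():
    return {"status": "ok"}


@app.get("/mcp")
def mcp(x_api_key: Optional[str] = Header(default=None)):
    check_key(x_api_key)

    def events():
        # A single SSE event listing the available tools.
        yield f"data: {json.dumps({'tools': MCPHello.tools})}\n\n"

    return StreamingResponse(events(), media_type="text/event-stream")


@app.post("/invoke")
def invoke(body: dict, x_api_key: Optional[str] = Header(default=None)):
    check_key(x_api_key)
    if body.get("tool") != "bmiCalculator":
        raise HTTPException(status_code=404, detail="unknown tool")
    params = body.get("params", {})
    weight = float(params["weight"])
    height = float(params["height"])
    if params.get("unit", "cm") == "cm":
        height /= 100  # convert centimetres to metres
    bmi = round(weight / (height ** 2), 2)
    return {"tool": "bmiCalculator", "result": {"bmi": bmi}}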

Quick Start (local)

pip install -r requirements.txt
export API_KEY=changeme   # optional
uvicorn main:app --host 0.0.0.0 --port 10000

Test:

# Discover tools (SSE)
curl -N -H "Accept: text/event-stream" -H "x-api-key: changeme" http://localhost:10000/mcp

# Invoke tool
curl -X POST http://localhost:10000/invoke \
  -H "Content-Type: application/json" \
  -H "x-api-key: changeme" \
  -d '{"tool":"bmiCalculator","params":{"weight":70,"height":175,"unit":"cm"}}'
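
If you prefer Python to curl, the same invocation can be made with the standard library alone. This sketch assumes the server from Quick Start is running locally with API_KEY=changeme; the exact response shape depends on main.py.

# Same request as the second curl command above, using only the standard library.
import json
import urllib.request

payload = {"tool": "bmiCalculator", "params": {"weight": 70, "height": 175, "unit": "cm"}}
req = urllib.request.Request(
    "http://localhost:10000/invoke",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json", "x-api-key": "changeme"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read()))  # should include the computed BMI (~22.86 for 70 kg / 175 cm)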

Docker

docker build -t mcp-bmi .
docker run -p 10000:10000 -e API_KEY=changeme mcp-bmi

Deploy to Render (example)

  • Create a new Web Service from this repo

  • Set environment variable API_KEY (optional but recommended)

  • Render builds the image from the Dockerfile and starts the service

Your base URL will look like:

https://<your-service>.onrender.com

Use the MCP SSE endpoint:

https://<your-service>.onrender.com/mcp

MCP Host config example

{ "mcpServers": { "bmiAgent": { "url": "https://<your-service>.onrender.com/mcp", "headers": { "x-api-key": "changeme" } } } }

Project structure

.
├── main.py
├── requirements.txt
├── Dockerfile
├── .gitignore
└── README.md

Notes

  • SSE (text/event-stream) is used for discovery. If your host requires WebSocket, add a /ws endpoint.

  • Extend the server by adding more tools to MCPHello.tools and dispatching on the tool name in /invoke (see the sketch below).
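
As a rough sketch of that extension point: the MCPHello.tools layout is assumed from the note above, and bmrCalculator is a hypothetical example tool, not part of this repo.

# Hypothetical second tool added to the catalogue advertised over /mcp.
MCPHello.tools.append({
    "name": "bmrCalculator",  # hypothetical example, not in this repo
    "description": "Basal metabolic rate from weight (kg), height (cm) and age (years).",
    "parameters": {"weight": "number", "height": "number", "age": "number"},
})


def bmr_calculator(params: dict) -> dict:
    # Mifflin-St Jeor estimate (male form) as a stand-in implementation.
    weight = float(params["weight"])
    height = float(params["height"])
    age = float(params["age"])
    return {"bmr": round(10 * weight + 6.25 * height - 5 * age + 5, 1)}


# Inside the /invoke handler, dispatch on the requested tool name, e.g.:
#     if body.get("tool") == "bmrCalculator":
#         return {"tool": "bmrCalculator", "result": bmr_calculator(body.get("params", {}))}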
