Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@MCP BMI Server Template calculate my BMI for 70kg and 175cm".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
# MCP BMI Server Template (FastAPI + SSE)

A minimal Model Context Protocol (MCP) server exposing a single tool: `bmiCalculator`.
- Discovery endpoint (SSE): `GET /mcp`
- Invocation endpoint: `POST /invoke`
- Health check: `GET /healthz`
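The tool itself comes down to the standard BMI formula: weight in kilograms divided by height in metres squared. A minimal sketch of the calculation behind `bmiCalculator` (the function name, unit handling, and response shape here are illustrative, not taken from `main.py`):

```python
def calculate_bmi(weight: float, height: float, unit: str = "cm") -> dict:
    """Compute BMI = weight_kg / height_m ** 2.

    `unit` describes the height ("cm" or "m"); weight is assumed to be
    in kilograms. Helper name and return shape are illustrative.
    """
    height_m = height / 100 if unit == "cm" else height
    bmi = weight / (height_m ** 2)
    # Rough WHO-style banding for a human-readable label
    if bmi < 18.5:
        category = "underweight"
    elif bmi < 25:
        category = "normal"
    elif bmi < 30:
        category = "overweight"
    else:
        category = "obese"
    return {"bmi": round(bmi, 1), "category": category}

# The example request used throughout this README: 70 kg, 175 cm
print(calculate_bmi(70, 175))  # {'bmi': 22.9, 'category': 'normal'}
```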
## Quick Start (local)

```shell
pip install -r requirements.txt
export API_KEY=changeme  # optional
uvicorn main:app --host 0.0.0.0 --port 10000
```

Test:

```shell
# Discover tools (SSE)
curl -N -H "Accept: text/event-stream" -H "x-api-key: changeme" http://localhost:10000/mcp

# Invoke tool
curl -X POST http://localhost:10000/invoke \
  -H "Content-Type: application/json" \
  -H "x-api-key: changeme" \
  -d '{"tool":"bmiCalculator","params":{"weight":70,"height":175,"unit":"cm"}}'
```

## Docker
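The same invocation can be made from Python using only the standard library. A sketch (the helper names are illustrative; it assumes the `/invoke` request shape shown in the curl example and a server running locally):

```python
import json
import urllib.request


def build_invoke_request(base_url: str, api_key: str,
                         tool: str, params: dict) -> urllib.request.Request:
    """Build the POST request for the server's /invoke endpoint."""
    body = json.dumps({"tool": tool, "params": params}).encode()
    return urllib.request.Request(
        f"{base_url}/invoke",
        data=body,
        headers={"Content-Type": "application/json", "x-api-key": api_key},
        method="POST",
    )


def invoke_tool(base_url: str, api_key: str, tool: str, params: dict) -> dict:
    """Send the invocation and decode the JSON response."""
    req = build_invoke_request(base_url, api_key, tool, params)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

For example, `invoke_tool("http://localhost:10000", "changeme", "bmiCalculator", {"weight": 70, "height": 175, "unit": "cm"})` mirrors the curl call above (and raises a connection error if the server is not running).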
```shell
docker build -t mcp-bmi .
docker run -p 10000:10000 -e API_KEY=changeme mcp-bmi
```

## Deploy to Render (example)
1. Create a new Web Service from this repo.
2. Set the environment variable `API_KEY` (optional but recommended).
3. The service will start with the Dockerfile.
Your base URL will look like:

`https://<your-service>.onrender.com`

Use the MCP SSE endpoint:

`https://<your-service>.onrender.com/mcp`

## MCP Host config example
```json
{
  "mcpServers": {
    "bmiAgent": {
      "url": "https://<your-service>.onrender.com/mcp",
      "headers": {
        "x-api-key": "changeme"
      }
    }
  }
}
```

## Project structure
```
.
├── main.py
├── requirements.txt
├── Dockerfile
├── .gitignore
└── README.md
```

## Notes
- SSE (`text/event-stream`) is used for discovery. If your host requires WebSocket, add a `/ws` endpoint.
- Extend by adding more tools in `MCPHello.tools` and dispatching in `/invoke`.
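The extension pattern in the last note amounts to a name-to-handler mapping that `/invoke` dispatches through. A sketch of that idea (the registry shape and the second tool are illustrative, not the actual `MCPHello` implementation):

```python
def bmi_calculator(params: dict) -> dict:
    """The existing tool: BMI from weight (kg) and height."""
    height_m = params["height"] / 100 if params.get("unit") == "cm" else params["height"]
    return {"bmi": round(params["weight"] / height_m ** 2, 1)}


def ideal_weight(params: dict) -> dict:
    """A hypothetical added tool: weight at a reference BMI of 22."""
    height_m = params["height"] / 100 if params.get("unit") == "cm" else params["height"]
    return {"ideal_weight_kg": round(22 * height_m ** 2, 1)}


# Illustrative registry; the real server keeps its list in MCPHello.tools.
TOOLS = {
    "bmiCalculator": bmi_calculator,
    "idealWeight": ideal_weight,
}


def dispatch(payload: dict) -> dict:
    """What /invoke would do: look up the named tool and run it."""
    handler = TOOLS.get(payload.get("tool"))
    if handler is None:
        return {"error": f"unknown tool: {payload.get('tool')}"}
    return handler(payload.get("params", {}))
```

Adding a tool is then a matter of writing the handler, registering it in the mapping, and (for discovery) advertising it over the `/mcp` SSE endpoint.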