Hello World MCP FastAPI Endpoint

by bankszach

This project exposes a minimal Model Context Protocol server backed by FastAPI. It registers both a resource and a tool that respond with a “Hello World” message so you can validate your MCP client integration end-to-end.

Setup

python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

Running the server

uvicorn app:app --reload --port 8080

The readiness probe is available at http://127.0.0.1:8080/ and the MCP streamable HTTP endpoint is mounted at http://127.0.0.1:8080/mcp.
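Under the hood, a streamable HTTP client begins by POSTing a JSON-RPC `initialize` request to the `/mcp` endpoint. As a rough illustration (MCP SDKs perform this handshake for you, and the `protocolVersion` and `clientInfo` values below are illustrative, not taken from this project), the request body looks like this:

```python
import json

# JSON-RPC 2.0 "initialize" request that an MCP streamable HTTP client
# POSTs first when connecting to http://127.0.0.1:8080/mcp.
# clientInfo values are placeholders, not from this project.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

body = json.dumps(initialize_request)
print(body)
```

You normally never construct this by hand; an MCP SDK sends the handshake and the required HTTP headers automatically when you point it at the endpoint.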

Trying it from an MCP client

Point your MCP-compatible client or SDK at http://127.0.0.1:8080/mcp. You should see:

  • resource://hello returning "Hello from the Model Context Protocol!"

  • say_hello tool returning a greeting.

These serve as a starting point for wiring up richer resources and tools.
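For reference, invoking the say_hello tool boils down to a JSON-RPC `tools/call` request like the following sketch. The `"name": "World"` argument is hypothetical; send a `tools/list` request to see the tool's actual input schema.

```python
import json

# JSON-RPC 2.0 "tools/call" request for the say_hello tool.
# The "arguments" payload is hypothetical; check the schema
# returned by a "tools/list" request for the real parameters.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "say_hello",
        "arguments": {"name": "World"},
    },
}

print(json.dumps(call_request, indent=2))
```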

Run with Docker (single command)

docker compose up --build

The server will be reachable on http://127.0.0.1:8080/ after the build completes.
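To see roughly what the compose command is doing, a minimal docker-compose.yml for this setup might look like the sketch below. This is an assumption for illustration, not the repository's actual file: the service name, build context, and port mapping are guesses.

```yaml
services:
  app:
    build: .          # build the image from the local Dockerfile
    ports:
      - "8080:8080"   # expose the uvicorn port on the host
```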
