SSE-based Server for MCP

This demonstrates a working pattern for SSE-based MCP servers.

Usage

uv run weather.py

Why?

This means the MCP server can be a long-running process that agents (clients) connect to, use, and disconnect from whenever and wherever they want. In other words, an SSE-based server and its clients can be decoupled processes, potentially even running on separate nodes. This better fits "cloud-native" use cases than the STDIO-based pattern, where the client itself spawns the server as a subprocess.
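For illustration, here is a minimal sketch of such a decoupled client, assuming the official mcp Python SDK; the server URL and the /sse endpoint path are assumptions, not taken from this repository:

# client_sketch.py -- connect to an already-running SSE MCP server;
# the client does not spawn the server, it just opens an HTTP connection.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    # URL and /sse path are assumed; adjust to wherever the server is running.
    async with sse_client("http://localhost:8080/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())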

Server

weather.py is an SSE-based MCP server that exposes tools backed by the National Weather Service APIs. It is adapted from the STDIO server example in the MCP docs.

By default, the server runs on 0.0.0.0:8080, but the host and port can be configured with command-line arguments:

uv run weather.py --host <your host> --port <your port>
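As a rough sketch of how such a server can be wired together, the snippet below uses FastMCP from the mcp Python SDK with the SSE transport and --host/--port flags; the tool and the NWS endpoint shown are illustrative and may not match what weather.py actually implements:

# server_sketch.py -- an SSE-based MCP server shape, using FastMCP;
# tool names and NWS endpoints here are examples, not copied from weather.py.
import argparse

import httpx
from mcp.server.fastmcp import FastMCP

NWS_API = "https://api.weather.gov"


def build_server(host: str, port: int) -> FastMCP:
    mcp = FastMCP("weather", host=host, port=port)

    @mcp.tool()
    async def get_alerts(state: str) -> str:
        """Return active National Weather Service alerts for a US state code."""
        async with httpx.AsyncClient() as client:
            resp = await client.get(f"{NWS_API}/alerts/active/area/{state}")
            resp.raise_for_status()
            return resp.text

    return mcp


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--host", default="0.0.0.0")
    parser.add_argument("--port", type=int, default=8080)
    args = parser.parse_args()
    # transport="sse" serves the MCP session over HTTP/SSE instead of stdio.
    build_server(args.host, args.port).run(transport="sse")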

Docker

A Dockerfile is provided for easy containerization of the server.

Build the image:

docker build -t weather-mcp-server .

Run the container:

docker run -d --name weather-mcp-server -p 8080:8080 weather-mcp-server

The server will be accessible at http://localhost:8080 (or the appropriate host if not running locally).
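To quickly check that the container is reachable, you can open its SSE endpoint; the /sse path below is the common convention for SSE-based MCP servers and is an assumption here, not confirmed from this repository:

curl -N http://localhost:8080/sse

The -N flag disables curl's output buffering so the event stream is printed as it arrives.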


