KNMI Weather MCP

by wolkwork
FROM python:3.12

ENV PYTHONUNBUFFERED=1

WORKDIR /app/

# Install uv
# Ref: https://docs.astral.sh/uv/guides/integration/docker/#installing-uv
COPY --from=ghcr.io/astral-sh/uv:0.6.4 /uv /uvx /bin/

# Place executables in the environment at the front of the path
# Ref: https://docs.astral.sh/uv/guides/integration/docker/#using-the-environment
ENV PATH="/app/.venv/bin:$PATH"

# Compile bytecode
# Ref: https://docs.astral.sh/uv/guides/integration/docker/#compiling-bytecode
ENV UV_COMPILE_BYTECODE=1

# uv cache
# Ref: https://docs.astral.sh/uv/guides/integration/docker/#caching
ENV UV_LINK_MODE=copy

# Install dependencies
# Ref: https://docs.astral.sh/uv/guides/integration/docker/#intermediate-layers
RUN --mount=type=cache,target=/root/.cache/uv \
    --mount=type=bind,source=uv.lock,target=uv.lock \
    --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
    uv sync --frozen --no-install-project

ENV PYTHONPATH=/app

COPY ./pyproject.toml ./uv.lock /app/
COPY ./src /app/src

# Sync the project
# Ref: https://docs.astral.sh/uv/guides/integration/docker/#intermediate-layers
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync

CMD ["fastmcp", "run", "src/knmi_weather_mcp/server.py", "-t", "sse"]
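A minimal sketch of building and running this image. The image tag knmi-weather-mcp and the host port mapping are assumptions (fastmcp's SSE transport listens on port 8000 by default), not values taken from this listing.

docker build -t knmi-weather-mcp .          # build the image from the Dockerfile above
docker run --rm -p 8000:8000 knmi-weather-mcp   # run the server; SSE endpoint on localhost:8000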

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/wolkwork/knmi-mcp'
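The same request can be made from Python with the standard library. This is a sketch only: the JSON fields accessed here ("name", "description") are assumptions about the API's response schema, not confirmed from its documentation.

```python
# Sketch: fetch this server's entry from the Glama MCP directory API.
import json
import urllib.request

API_URL = "https://glama.ai/api/mcp/v1/servers/wolkwork/knmi-mcp"


def fetch_server_info(url: str = API_URL) -> dict:
    """Return the directory entry for an MCP server as a dict.

    The response is assumed to be a JSON object; the exact fields
    (e.g. "name", "description") are not documented in this listing.
    """
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)
```

Calling `fetch_server_info()` returns the parsed JSON entry, which can then be inspected for whatever fields the API exposes.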

If you have feedback or need assistance with the MCP directory API, please join our Discord server.