MCP Weather Project

A Model Context Protocol (MCP) server implementation that provides weather alerts via the National Weather Service (NWS) API. This project includes both a FastMCP server and a LangChain-based client with memory capabilities.

Features

  • Weather Alerts: Fetch active weather alerts for any US state using the NWS API.

  • Echo Resource: A simple resource that echoes back messages.

  • Greeting Prompt: A customizable greeting prompt generator.

  • Interactive Client: A CLI-based chat client powered by Groq's Llama 3.3 model with conversation memory.

Prerequisites

  • Python 3.12 or higher

  • uv (recommended for dependency management)

  • A Groq API Key for the client.

Installation

  1. Clone the repository:

    git clone <repository_url>
    cd mcpfile
  2. Install dependencies: Using uv (recommended):

    uv sync

    Or using pip:

    pip install -r requirements.txt

    (Note: You may need to generate a requirements.txt from pyproject.toml first, e.g. with uv pip compile.)

  3. Set up Environment Variables: Create a .env file in the root directory and add your Groq API key:

    GROQ_API_KEY=your_groq_api_key_here
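The client reads this key at startup. A minimal sketch of how that typically looks (assuming python-dotenv, which this README does not explicitly confirm):

    import os
    from dotenv import load_dotenv

    load_dotenv()  # pick up GROQ_API_KEY from the .env file in the project root
    groq_api_key = os.environ["GROQ_API_KEY"]  # raises KeyError if the key is missing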

Usage

Running the Entry Point

The main.py script is a simple entry point that prints a welcome message.

uv run main.py
# OR
python main.py
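main.py is likely only a few lines; a sketch of such an entry point (the exact message is an assumption):

    def main():
        # Placeholder entry point; the real functionality lives in server/weather.py and server/client.py
        print("Welcome to the MCP Weather Project!")

    if __name__ == "__main__":
        main()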

Running the Interactive Client

The client connects to the weather server and allows you to interact with it using natural language.

  1. Ensure the server configuration in server/weather.json is correct (it points to server/weather.py).

  2. Run the client:

    uv run server/client.py
    # OR, if using a virtual environment directly:
    # python server/client.py
  3. Example Interaction:

    You: Check weather alerts for TX
    Assistant: Checking weather alerts for Texas...
    [Agent responds with alerts]
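Under the hood, the client drives the server over stdio using the MCP protocol. A minimal sketch of that connection using the official mcp Python SDK (shown without the LangChain/Groq layer, and assuming the same launch command as server/weather.json):

    import asyncio
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main():
        # Launch the weather server as a subprocess, mirroring server/weather.json
        params = StdioServerParameters(
            command="uv",
            args=["run", "--with", "mcp[cli]", "mcp", "run", "server/weather.py"],
        )
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                print("Tools:", [t.name for t in tools.tools])  # e.g. ['get_alerts']
                result = await session.call_tool("get_alerts", {"state": "TX"})
                print(result.content)

    asyncio.run(main())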

Running the MCP Server Standalone

You can run the MCP server directly using uv. This is useful for inspection or debugging with the MCP Inspector.

uv run --with "mcp[cli]" mcp run server/weather.py
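To open the MCP Inspector against this server, the SDK's dev command is typically used instead of run (assuming the mcp[cli] extra provides the Inspector tooling, which this README does not spell out):

    uv run --with "mcp[cli]" mcp dev server/weather.py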

Configuration Verification

Ensure that server/weather.json points to the correct absolute path of your server/weather.py file.

{
  "mcpServers": {
    "weather": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mcp[cli]",
        "mcp",
        "run",
        "/your/absolute/path/to/mcp/mcpfile/server/weather.py"
      ]
    }
  }
}

Project Structure

  • server/weather.py: The main MCP server implementation using FastMCP. Defines tools (get_alerts), resources, and prompts.

  • server/client.py: An MCP client implementation using LangChain and ChatGroq. Handles the interactive chat session.

  • server/weather.json: Configuration file for the MCP client to locate the server.

  • main.py: Simple entry point script.

  • pyproject.toml: Project configuration and dependencies.

Tools Available

  • get_alerts(state: str): Get active weather alerts for a US state (e.g., "CA", "NY").
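For reference, server/weather.py defines this tool with a FastMCP decorator. The sketch below illustrates that pattern; the NWS endpoint and response fields shown are assumptions, not a copy of the actual implementation:

    import httpx
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("weather")

    @mcp.tool()
    async def get_alerts(state: str) -> str:
        """Return active NWS alerts for a two-letter US state code, e.g. 'CA'."""
        url = f"https://api.weather.gov/alerts/active?area={state}"
        async with httpx.AsyncClient() as client:
            resp = await client.get(url, headers={"Accept": "application/geo+json"})
            resp.raise_for_status()
            features = resp.json().get("features", [])
        if not features:
            return f"No active alerts for {state}."
        return "\n---\n".join(f["properties"]["headline"] for f in features)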

Resources

  • echo://{message}: Echoes a message.

Prompts

  • greet_user(name: str, style: str): Generates a greeting in a specified style ("friendly", "formal", or "casual").
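The echo resource and the greet_user prompt both follow the standard FastMCP decorator pattern. A minimal sketch (the names match the README; the bodies are assumptions):

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("weather")

    @mcp.resource("echo://{message}")
    def echo_resource(message: str) -> str:
        """Echo back the message embedded in the resource URI."""
        return f"Echo: {message}"

    @mcp.prompt()
    def greet_user(name: str, style: str = "friendly") -> str:
        """Build a greeting prompt in the requested style."""
        styles = {
            "friendly": f"Write a warm, friendly greeting for {name}.",
            "formal": f"Write a formal, professional greeting for {name}.",
            "casual": f"Write a short, casual greeting for {name}.",
        }
        return styles.get(style, styles["friendly"])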
