MCP Crash Course

This project demonstrates how to build and interact with an MCP (Model Context Protocol) server using Python, with support for weather alerts and conversational agents. It also includes Docker support for easy deployment.

Features

  • Weather Alerts Tool:
    Get real-time weather alerts for any US state using the National Weather Service API (see the sketch after this list).
  • Conversational Agent:
    Uses langchain-groq and MCP tools to enable chat-based interactions with memory. Model used: llama-3.3-70b-versatile (via ChatGroq).
  • MCP Server & Client:
    Includes both server and client implementations for tool invocation and chat.
  • Docker Support:
    Easily build and run the server in a containerized environment.
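
As a concrete illustration of the weather tool, here is a minimal sketch of how an alerts tool can be registered with FastMCP and queried with httpx. The tool name, User-Agent header, and response formatting are illustrative assumptions, not the exact contents of server/weather.py:

from mcp.server.fastmcp import FastMCP
import httpx

mcp = FastMCP("weather")

NWS_API = "https://api.weather.gov"

@mcp.tool()
async def get_alerts(state: str) -> str:
    """Return active weather alerts for a two-letter US state code, e.g. 'CA'."""
    async with httpx.AsyncClient() as client:
        resp = await client.get(
            f"{NWS_API}/alerts/active/area/{state}",
            headers={"User-Agent": "mcp-crash-course", "Accept": "application/geo+json"},
            timeout=30.0,
        )
        resp.raise_for_status()
        data = resp.json()
    features = data.get("features", [])
    if not features:
        return f"No active alerts for {state}."
    # Summarize each alert as "EVENT: HEADLINE"
    return "\n---\n".join(
        f'{f["properties"]["event"]}: {f["properties"]["headline"]}'
        for f in features
    )

if __name__ == "__main__":
    mcp.run(transport="stdio")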

Project Structure

.
├── main.py
├── pyproject.toml
├── README.md
├── server/
│   ├── weather.py          # Weather tool implementation
│   ├── client.py           # Chat client using MCPAgent
│   └── weather.json        # MCP tool configuration
├── mcpserver/
│   ├── server.py           # Example MCP server
│   ├── client-stdio.py     # Example stdio client
│   ├── client-sse.py       # Example SSE client
│   ├── Dockerfile          # Docker setup for server
│   └── requirements.txt    # Minimal requirements for Docker
└── .venv/                  # Virtual environment (ignored)
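
server/weather.json tells the chat client which MCP servers to launch. Its exact contents aren't reproduced here, but a typical mcp-use style configuration looks roughly like the following (the server name and path are assumptions):

{
  "mcpServers": {
    "weather": {
      "command": "python",
      "args": ["server/weather.py"]
    }
  }
}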

Getting Started

1. Clone the repository

git clone https://github.com/Muhammadaffan05/MCP.git
cd MCP

2. Set up a virtual environment (recommended)

uv venv --python=python3.13
source .venv/bin/activate
uv add -r pyproject.toml

3. Configure Environment Variables

Create a .env file in the root or set the following in your environment:

  • GROQ_API_KEY (for language model access)
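
A minimal .env file contains just that key (the value below is a placeholder):

GROQ_API_KEY=your_groq_api_key_here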

4. Run the MCP Server

python mcpserver/server.py

5. Run the Client

python server/client.py
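
server/client.py wires the Groq model to the MCP tools. Its implementation isn't reproduced here, but with the mcp-use MCPAgent API it might look roughly like this sketch (the config path, memory_enabled flag, and example question are assumptions; GROQ_API_KEY is expected to be set in the environment):

import asyncio
from langchain_groq import ChatGroq
from mcp_use import MCPAgent, MCPClient

async def main():
    # Launch the MCP server(s) defined in the JSON config and expose their tools
    client = MCPClient.from_config_file("server/weather.json")
    llm = ChatGroq(model="llama-3.3-70b-versatile")  # reads GROQ_API_KEY from the environment
    agent = MCPAgent(llm=llm, client=client, memory_enabled=True)
    answer = await agent.run("Are there any active weather alerts for CA?")
    print(answer)

asyncio.run(main())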

Or try the stdio client:

python mcpserver/client-stdio.py
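
mcpserver/client-stdio.py talks to the server over stdio. A minimal version using the official MCP Python SDK looks roughly like this sketch (the server command and the tool-listing call are illustrative):

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

params = StdioServerParameters(command="python", args=["mcpserver/server.py"])

async def main():
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])

asyncio.run(main())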

Docker Usage

Build and run the server in Docker:

cd mcpserver
docker build -t mcp-server .
docker run -p 8000:8000 mcp-server

  • The Dockerfile uses Python 3.11, installs dependencies with uv, and runs the MCP server.
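
The Dockerfile itself isn't reproduced here, but based on that description it is roughly equivalent to the following sketch (the base image tag, install commands, and entrypoint are assumptions):

FROM python:3.11-slim

WORKDIR /app

# Install uv, then use it to install the minimal server requirements
RUN pip install uv
COPY requirements.txt .
RUN uv pip install --system -r requirements.txt

COPY . .
EXPOSE 8000

CMD ["python", "server.py"]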

Dependencies

  • Python 3.13+ (or 3.11+ for Docker)
  • httpx
  • langchain-groq
  • mcp-use
  • mcp[cli]
  • nest-asyncio

All dependencies are managed via pyproject.toml and uv.

Notes

  • .venv/, __pycache__/, and other generated files are gitignored.
  • For best results, use the latest stable Python version.
  • Make sure to set up your API keys and environment variables as needed.
