
Agentic MCP Weather System

by Shivbaj
DOCKER.md
# Docker Files Summary

This document summarizes all Docker-related files in the Weather MCP project.

## 🐳 Core Docker Files

### `Dockerfile`
- **Purpose**: Production container image for the Weather MCP server
- **Features**: Multi-stage build, security hardening, non-root user
- **Base**: Python 3.13-slim
- **Command**: `python main.py server --host 0.0.0.0 --port 8000`

### `docker-compose.yml`
- **Purpose**: Main orchestration file for the complete weather intelligence system
- **Services**:
  - `weather-streamlit`: Streamlit chat interface (port 8501), the primary UI
  - `weather-server`: Weather MCP API + agent coordination hub (port 8000)
  - `ollama`: Ollama LLM server with AI models (port 11434)
  - `ollama-setup`: Automated model downloader (llama3, phi3)
  - `weather-demo`: Testing client (optional profile)
- **Networks**: `weather-mcp-network` (internal service communication)
- **Volumes**: `ollama_data` for persistent model storage

### `docker-compose.prod.yml`
- **Purpose**: Production overrides
- **Features**: Resource limits, security settings, health checks
- **Usage**: `docker-compose -f docker-compose.yml -f docker-compose.prod.yml up -d`

### `docker-compose.dev.yml`
- **Purpose**: Development overrides
- **Features**: Live reload, debug logging, relaxed security
- **Usage**: `docker-compose -f docker-compose.yml -f docker-compose.dev.yml up -d`

## 📋 Configuration Files

### `.env`
- **Purpose**: Docker environment variables
- **Settings**: Server config, Ollama integration, security settings
- **Note**: Pre-configured for Docker networking (`OLLAMA_HOST=http://ollama:11434`)

### `.env.production`
- **Purpose**: Production environment template
- **Usage**: Copy from `.env.example` and customize for production

### `.dockerignore`
- **Purpose**: Exclude files from the Docker build context
- **Excludes**: `__pycache__`, `.git`, development files, logs

## 🚀 Management Scripts

### `start-docker.sh`
- **Purpose**: Comprehensive startup script with options
- **Features**: Environment selection, demo mode, verbose logging
- **Usage**: `./start-docker.sh [--prod|--dev] [--demo] [--verbose] [--build]`

### `stop-docker.sh`
- **Purpose**: Clean shutdown with cleanup options
- **Features**: Graceful stop, cleanup modes, data preservation
- **Usage**: `./stop-docker.sh [--cleanup] [--remove-data]`

### `validate-docker.sh`
- **Purpose**: Environment validation and setup
- **Features**: Docker version check, memory validation, compose wrapper generation
- **Usage**: `./validate-docker.sh`

## 🔧 Generated Files

### `docker-compose-wrapper.sh`
- **Purpose**: Auto-generated by `validate-docker.sh`
- **Function**: Abstracts the difference between legacy `docker-compose` and Compose v2 (`docker compose`)
- **Note**: Automatically created during validation

## 📁 Directory Structure for Docker

```
weather/
├── docker-compose.yml        # Main orchestration
├── docker-compose.prod.yml   # Production overrides
├── docker-compose.dev.yml    # Development overrides
├── Dockerfile                # Container image definition
├── .dockerignore             # Build exclusions
├── .env                      # Docker environment config
├── .env.production           # Production template
├── start-docker.sh           # Startup script
├── stop-docker.sh            # Shutdown script
├── validate-docker.sh        # Environment validation
├── requirements.txt          # Python dependencies
└── [application files]
```

## 🌐 Network Architecture

```
Host System (localhost)
├── 8501  → weather-streamlit:8501 (Streamlit chat interface)
├── 8000  → weather-server:8000    (Weather API + agent hub)
├── 11434 → ollama:11434           (Ollama LLM engine)
└── Docker network: weather-mcp-network
    ├── weather-streamlit (web interface)
    ├── weather-server    (MCP API + agents)
    ├── ollama            (LLM service)
    ├── ollama-setup      (model downloader)
    └── weather-demo      (testing, optional)
```

## 🎯 Current System Status

### Active Services (as of the latest update)

```bash
$ docker ps
CONTAINER ID   IMAGE                    PORTS                      NAMES
75f8eb1fc7f4   weather-streamlit-ui     0.0.0.0:8501->8501/tcp     weather-streamlit
9621ce1e1d44   weather-weather-server   0.0.0.0:8000->8000/tcp     weather-mcp
a71d0f714abf   ollama/ollama:latest     0.0.0.0:11434->11434/tcp   weather-ollama
```

### Service Health Status
- ✅ **Streamlit UI**: http://localhost:8501 (chat interface)
- ✅ **Weather API**: http://localhost:8000 (agent coordination hub)
- ✅ **Ollama LLM**: http://localhost:11434 (AI models: llama3, phi3)
- ✅ **System health**: All services passing health checks

## 🔄 Service Dependencies

```
┌─────────────────┐
│  Streamlit UI   │
│  (Port 8501)    │
└────────┬────────┘
         ▼
┌─────────────────┐
│ Weather Server  │
│  (Port 8000)    │
│  + Agent Hub    │
└────────┬────────┘
         ▼
┌─────────────────┐
│   Ollama LLM    │
│  (Port 11434)   │
│ Models: llama3  │
└─────────────────┘
```

## 💡 Quick Reference

```bash
# Complete setup from scratch
git clone <repo> && cd weather-mcp-agent
chmod +x *.sh
./validate-docker.sh
./start-docker.sh --verbose

# Development workflow
./start-docker.sh --dev --demo
# Make changes to code...
./start-docker.sh --dev --build   # Rebuild after changes

# Production deployment
./start-docker.sh --prod
curl http://localhost:8000/health

# Cleanup
./stop-docker.sh --cleanup
```
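The generated `docker-compose-wrapper.sh` is only described above, not shown. A minimal sketch of the v1/v2 detection it could perform (the function name `detect_compose` is hypothetical, not taken from the project):

```shell
#!/bin/sh
# Sketch of the logic a generated docker-compose-wrapper.sh could use:
# prefer Compose v2 ("docker compose"), fall back to the legacy
# docker-compose binary, and report "none" if neither is available.
detect_compose() {
  if docker compose version >/dev/null 2>&1; then
    echo "docker compose"
  elif command -v docker-compose >/dev/null 2>&1; then
    echo "docker-compose"
  else
    echo "none"
  fi
}

COMPOSE_CMD="$(detect_compose)"
echo "using: $COMPOSE_CMD"
```

A real wrapper would then forward its arguments, e.g. `exec $COMPOSE_CMD "$@"`, so the rest of the scripts can call one command regardless of the installed Compose flavor.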
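The health checks mentioned above amount to polling each service's `/health` endpoint until it answers. A minimal sketch of such a wait loop, assuming the function names `probe` and `wait_for_health` (they are illustrations, not part of the project's scripts):

```shell
#!/bin/sh
# Poll a health URL until it responds or we run out of attempts.
probe() {
  # -f makes curl exit non-zero on HTTP errors, -sS keeps output quiet
  curl -fsS "$1" >/dev/null 2>&1
}

wait_for_health() {
  url=$1
  tries=${2:-30}
  delay=${3:-2}
  i=0
  while [ "$i" -lt "$tries" ]; do
    if probe "$url"; then
      echo "healthy: $url"
      return 0
    fi
    i=$((i + 1))
    sleep "$delay"
  done
  echo "timed out waiting for $url" >&2
  return 1
}
```

Usage: `wait_for_health http://localhost:8000/health 30 2` blocks for up to a minute, which is roughly what a startup script needs before declaring the stack ready.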
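The `ollama-setup` service is only named above; conceptually it is a one-shot container that pulls each model through the Ollama server. A hedged sketch of that idea, where the loop structure is an assumption and the `/api/pull` request mirrors Ollama's HTTP API (the project's actual setup script is not shown in this summary):

```shell
#!/bin/sh
# Sketch of a one-shot model downloader: the model list matches the
# services described above; the network call is commented out so the
# sketch runs outside the Docker network.
MODELS="llama3 phi3"

pull_model() {
  # Ollama's pull endpoint streams download progress as JSON lines.
  curl -fsS http://ollama:11434/api/pull -d "{\"name\": \"$1\"}"
}

pull_all() {
  for m in $MODELS; do
    echo "pulling $m"
    # pull_model "$m"   # enabled inside the weather-mcp-network
  done
}

pull_all
```

Because the models land in the `ollama_data` volume, this container only needs to run once per host; subsequent stack restarts reuse the downloaded weights.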
