# Docker Files Summary
This document summarizes all Docker-related files in the Weather MCP project.
## 🐳 Core Docker Files
### `Dockerfile`
- **Purpose**: Production container image for Weather MCP server
- **Features**: Multi-stage build, security hardening, non-root user
- **Base**: Python 3.13-slim
- **Command**: `python main.py server --host 0.0.0.0 --port 8000`
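A minimal sketch of the multi-stage layout, assuming a `requirements.txt` install step and an `appuser` account for the non-root user; only the base image and the start command are taken from this document:
```dockerfile
# Build stage: install Python dependencies into an isolated prefix
FROM python:3.13-slim AS builder
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir --prefix=/install -r requirements.txt

# Runtime stage: copy only the installed packages and drop root privileges
FROM python:3.13-slim
WORKDIR /app
COPY --from=builder /install /usr/local
COPY . .
RUN useradd --create-home appuser && chown -R appuser:appuser /app
USER appuser
EXPOSE 8000
CMD ["python", "main.py", "server", "--host", "0.0.0.0", "--port", "8000"]
```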
### `docker-compose.yml`
- **Purpose**: Main orchestration file for complete weather intelligence system
- **Services**:
  - `weather-streamlit`: Streamlit chat interface (port 8501) - **Primary UI**
  - `weather-server`: Weather MCP API + Agent Coordination Hub (port 8000)
  - `ollama`: Ollama LLM server with AI models (port 11434)
  - `ollama-setup`: Automated model downloader (llama3, phi3)
  - `weather-demo`: Testing client (optional profile)
- **Networks**: `weather-mcp-network` (internal service communication)
- **Volumes**: `ollama_data` for persistent model storage
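A condensed sketch of how the services could be wired together; the service names, ports, network, and `ollama_data` volume come from the list above, while the build contexts, mount path, and model-pull entrypoint are illustrative assumptions:
```yaml
services:
  weather-streamlit:
    build: .                     # assumed build context for the Streamlit image
    ports:
      - "8501:8501"
    depends_on:
      - weather-server
    networks:
      - weather-mcp-network

  weather-server:
    build: .
    ports:
      - "8000:8000"
    environment:
      - OLLAMA_HOST=http://ollama:11434
    depends_on:
      - ollama
    networks:
      - weather-mcp-network

  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama   # persistent model storage
    networks:
      - weather-mcp-network

  ollama-setup:
    image: ollama/ollama:latest
    environment:
      - OLLAMA_HOST=http://ollama:11434
    entrypoint: ["sh", "-c", "ollama pull llama3 && ollama pull phi3"]
    depends_on:
      - ollama
    networks:
      - weather-mcp-network

  weather-demo:
    build: .
    profiles:
      - demo                     # only started when the optional profile is enabled
    networks:
      - weather-mcp-network

networks:
  weather-mcp-network:

volumes:
  ollama_data:
```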
### `docker-compose.prod.yml`
- **Purpose**: Production overrides
- **Features**: Resource limits, security settings, health checks
- **Usage**: `docker-compose -f docker-compose.yml -f docker-compose.prod.yml up -d`
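A hedged example of what such overrides typically look like; the specific limits, the `no-new-privileges` option, and the curl-based health check are assumptions, with the `/health` endpoint taken from the Quick Reference below:
```yaml
services:
  weather-server:
    restart: unless-stopped
    security_opt:
      - no-new-privileges:true       # example hardening setting
    deploy:
      resources:
        limits:
          cpus: "1.0"                # illustrative resource limits
          memory: 1G
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      timeout: 5s
      retries: 3
```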
### `docker-compose.dev.yml`
- **Purpose**: Development overrides
- **Features**: Live reload, debug logging, relaxed security
- **Usage**: `docker-compose -f docker-compose.yml -f docker-compose.dev.yml up -d`
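For illustration, a development override of this kind usually bind-mounts the source tree for live reload and raises log verbosity; the mount target and the `LOG_LEVEL` variable name are assumptions:
```yaml
services:
  weather-server:
    volumes:
      - .:/app                  # bind-mount the source tree for live reload
    environment:
      - LOG_LEVEL=DEBUG         # hypothetical variable name for debug logging
```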
## 📋 Configuration Files
### `.env`
- **Purpose**: Docker environment variables
- **Settings**: Server config, Ollama integration, security settings
- **Note**: Pre-configured for Docker networking (`OLLAMA_HOST=http://ollama:11434`)
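An illustrative fragment; only `OLLAMA_HOST` and its Docker-network value are documented here, the remaining variable names are placeholders:
```bash
# Ollama is reached via the Docker network, not localhost (documented above)
OLLAMA_HOST=http://ollama:11434

# Hypothetical server settings; match them to the real names in .env.example
SERVER_HOST=0.0.0.0
SERVER_PORT=8000
```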
### `.env.production`
- **Purpose**: Production environment template
- **Usage**: Copy from `.env.example` and customize for production
### `.dockerignore`
- **Purpose**: Exclude files from Docker build context
- **Excludes**: `__pycache__`, `.git`, development files, logs
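A plausible exclusion list covering the categories above; the project's actual entries may differ:
```
__pycache__/
*.pyc
.git/
.venv/
*.log
```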
## 🚀 Management Scripts
### `start-docker.sh`
- **Purpose**: Comprehensive startup script with options
- **Features**: Environment selection, demo mode, verbose logging
- **Usage**: `./start-docker.sh [--prod|--dev] [--demo] [--verbose] [--build]`
### `stop-docker.sh`
- **Purpose**: Clean shutdown with cleanup options
- **Features**: Graceful stop, cleanup modes, data preservation
- **Usage**: `./stop-docker.sh [--cleanup] [--remove-data]`
### `validate-docker.sh`
- **Purpose**: Environment validation and setup
- **Features**: Docker version check, memory validation, compose wrapper
- **Usage**: `./validate-docker.sh`
## 🔧 Generated Files
### `docker-compose-wrapper.sh`
- **Purpose**: Auto-generated by `validate-docker.sh`
- **Function**: Abstracts docker-compose vs docker compose v2 differences
- **Note**: Automatically created during validation
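A sketch of what such a wrapper typically contains, assuming it simply delegates to whichever Compose command the validation script detected:
```bash
#!/usr/bin/env bash
# Delegate to whichever Compose entry point exists on this host.
if docker compose version >/dev/null 2>&1; then
    exec docker compose "$@"      # Compose v2 plugin
else
    exec docker-compose "$@"      # legacy standalone binary
fi
```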
## 📁 Directory Structure for Docker
```
weather/
├── docker-compose.yml        # Main orchestration
├── docker-compose.prod.yml   # Production overrides
├── docker-compose.dev.yml    # Development overrides
├── Dockerfile                # Container image definition
├── .dockerignore             # Build exclusions
├── .env                      # Docker environment config
├── .env.production           # Production template
├── start-docker.sh           # Startup script
├── stop-docker.sh            # Shutdown script
├── validate-docker.sh        # Environment validation
├── requirements.txt          # Python dependencies
└── [application files]
```
## 🌐 Network Architecture
```
Host System (localhost)
├── 8501  → weather-streamlit:8501 (Streamlit Chat Interface)
├── 8000  → weather-server:8000 (Weather API + Agent Hub)
├── 11434 → ollama:11434 (Ollama LLM Engine)
└── Docker Network: weather-mcp-network
    ├── weather-streamlit (web interface)
    ├── weather-server (MCP API + agents)
    ├── ollama (LLM service)
    ├── ollama-setup (model downloader)
    └── weather-demo (testing - optional)
```
## 🎯 **Current System Status**
### Active Services (as of latest update)
```bash
$ docker ps
CONTAINER ID   IMAGE                    PORTS                      NAMES
75f8eb1fc7f4   weather-streamlit-ui     0.0.0.0:8501->8501/tcp     weather-streamlit
9621ce1e1d44   weather-weather-server   0.0.0.0:8000->8000/tcp     weather-mcp
a71d0f714abf   ollama/ollama:latest     0.0.0.0:11434->11434/tcp   weather-ollama
```
### Service Health Status
- ✅ **Streamlit UI**: http://localhost:8501 (Chat Interface)
- ✅ **Weather API**: http://localhost:8000 (Agent Coordination Hub)
- ✅ **Ollama LLM**: http://localhost:11434 (AI Models: llama3, phi3)
- ✅ **System Health**: All services passing health checks
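To reproduce these checks from the host, something like the following works; the `/health` path is documented in the Quick Reference below, and `/api/tags` is the standard Ollama endpoint for listing installed models:
```bash
# Spot-check the three published endpoints from the host
curl -f  http://localhost:8000/health      # Weather API health endpoint
curl -s  http://localhost:11434/api/tags   # Ollama: lists the installed models
curl -sI http://localhost:8501 | head -1   # Streamlit: expect an HTTP 200 status line
```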
## 🔄 **Service Dependencies**
```
┌──────────────────┐
│   Streamlit UI   │
│   (Port 8501)    │
└────────┬─────────┘
         │
         ▼
┌──────────────────┐
│  Weather Server  │
│   (Port 8000)    │
│   + Agent Hub    │
└────────┬─────────┘
         │
         ▼
┌──────────────────┐
│    Ollama LLM    │
│   (Port 11434)   │
│  Models: llama3  │
└──────────────────┘
```
## 💡 Quick Reference
```bash
# Complete setup from scratch
git clone <repo> && cd weather-mcp-agent
chmod +x *.sh
./validate-docker.sh
./start-docker.sh --verbose
# Development workflow
./start-docker.sh --dev --demo
# Make changes to code...
./start-docker.sh --dev --build # Rebuild after changes
# Production deployment
./start-docker.sh --prod
curl http://localhost:8000/health
# Cleanup
./stop-docker.sh --cleanup
```