
MCP Weather Server

by Meloyg
docker-compose.yml (1.12 kB)
```yaml
version: "3.8"

services:
  mcp-weather-server:
    build: .
    container_name: mcp-weather-server
    restart: unless-stopped
    environment:
      - NODE_ENV=production
    # For MCP stdio communication, we need to run in interactive mode
    stdin_open: true
    tty: true
    # Mount logs directory for persistent logging
    volumes:
      - ./logs:/app/logs
    # Network mode for MCP communication
    network_mode: host
    # Resource limits
    deploy:
      resources:
        limits:
          memory: 256M
          cpus: "0.5"
        reservations:
          memory: 128M
          cpus: "0.25"

  # Development service with hot reload and debugging
  mcp-weather-server-dev:
    build:
      context: .
      dockerfile: Dockerfile.dev
    container_name: mcp-weather-server-dev
    restart: unless-stopped
    environment:
      - NODE_ENV=development
    stdin_open: true
    tty: true
    ports:
      - "9229:9229" # Node.js inspector port
      - "3000:3000" # Application port (if needed)
    volumes:
      - .:/app
      - /app/node_modules
      - ./logs:/app/logs
    profiles:
      - dev
```
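Assuming Docker Compose v2 is installed and this file sits in the project root, the two services can be started as sketched below. The service and profile names come from the compose file; the `-d` flag and log-following commands are standard Compose CLI usage, not something prescribed by this project.

```shell
# Start the production service (the dev service is excluded
# because it is gated behind the "dev" profile)
docker compose up -d mcp-weather-server

# Start the development service instead, enabling its profile;
# the Node.js inspector is then reachable on localhost:9229
docker compose --profile dev up -d mcp-weather-server-dev

# Follow the logs of a running service
docker compose logs -f mcp-weather-server
```

Because the production service uses `network_mode: host` and stdio (`stdin_open`/`tty`), an MCP client would typically attach to the container's stdin/stdout rather than a TCP port.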

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Meloyg/life-assitant-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.