docker-compose.yml (1.57 kB)
version: "3.8"

services:
  mcp-process:
    image: digitaldefiance/mcp-process:latest
    container_name: mcp-process-server

    # Build configuration (for local development)
    build:
      context: .
      dockerfile: Dockerfile

    # Security settings
    security_opt:
      - no-new-privileges:true
    cap_drop:
      - ALL
    cap_add:
      - CHOWN
      - SETUID
      - SETGID
    read_only: false

    # Resource limits
    deploy:
      resources:
        limits:
          cpus: "2.0"
          memory: 2G
        reservations:
          cpus: "0.5"
          memory: 512M

    # Environment variables
    environment:
      - NODE_ENV=production
      - MCP_PROCESS_CONFIG_PATH=/app/config/mcp-process-config.json

    # Volumes
    volumes:
      # Mount configuration directory
      - ./config:/app/config:ro
      # Optional: Mount workspace directory for process execution
      # - ./workspace:/workspace:rw

    # Restart policy
    restart: unless-stopped

    # Logging
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "3"

    # Network mode (stdio transport doesn't need network)
    network_mode: none

    # Health check
    healthcheck:
      test: ["CMD", "node", "-e", "process.exit(0)"]
      interval: 30s
      timeout: 5s
      retries: 3
      start_period: 10s

    # User (non-root)
    user: "1001:1001"

    # Stdin/stdout for MCP protocol
    stdin_open: true
    tty: false

# Optional: Development override
# Create docker-compose.override.yml for local development settings
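The closing comment suggests keeping local-development settings in a docker-compose.override.yml, which Docker Compose merges over the base file automatically. A minimal sketch of what such an override might contain — the specific relaxed settings and the ./workspace mount shown here are assumptions for illustration, not part of the project:

# docker-compose.override.yml — example local-development overrides (assumed values)
services:
  mcp-process:
    # Build from local source rather than pulling the published image
    build:
      context: .
      dockerfile: Dockerfile
    environment:
      - NODE_ENV=development
    volumes:
      - ./config:/app/config:ro
      # Enable the workspace mount that the base file leaves commented out
      - ./workspace:/workspace:rw

With both files in the same directory, a plain `docker compose up` applies the override; pass only `-f docker-compose.yml` to run the production configuration unmodified.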

MCP directory API

We provide information about every listed MCP server via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Digital-Defiance/mcp-process'
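The endpoint returns a JSON document describing the server. A minimal Python sketch of fetching and summarizing it — note that the `name` and `description` fields are assumptions based on typical directory APIs, not a documented schema, so check a real response before relying on them:

```python
import json
import urllib.request

API_URL = "https://glama.ai/api/mcp/v1/servers/Digital-Defiance/mcp-process"

def fetch_server_info(url: str = API_URL) -> dict:
    """Fetch the raw JSON document for one MCP server from the directory API."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

def summarize(info: dict) -> str:
    """One-line summary; 'name' and 'description' are assumed field names."""
    name = info.get("name", "unknown")
    desc = info.get("description", "")
    return f"{name}: {desc}" if desc else name

# Offline usage example with a made-up payload (not a real API response):
sample = {"name": "mcp-process", "description": "Process management MCP server"}
print(summarize(sample))  # -> mcp-process: Process management MCP server
```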

If you have feedback or need assistance with the MCP directory API, please join our Discord server.