docker-compose.yml
version: '3.8'

services:
  guardian-mcp:
    build:
      context: .
      dockerfile: Dockerfile
    image: kalmars/guardian-mcp:latest
    container_name: guardian-mcp
    restart: unless-stopped

    # MCP servers use stdio, so we use stdin_open and tty
    stdin_open: true
    tty: true

    # Environment variables (optional)
    environment:
      - NODE_ENV=production

    # Mount project directory for scanning (optional)
    # Uncomment and adjust path if you want to scan specific projects
    # volumes:
    #   - /path/to/your/projects:/projects:ro

    # Resource limits (optional)
    deploy:
      resources:
        limits:
          cpus: '1'
          memory: 512M
        reservations:
          cpus: '0.5'
          memory: 256M

    # Logging configuration
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "3"
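
A minimal usage sketch, assuming the file above is saved as docker-compose.yml in the project root (the commands below are standard Docker Compose invocations, not taken from the project's own docs):

# Build the image and start the container in the background
docker compose up -d --build

# Follow the container logs
docker compose logs -f guardian-mcp

# The server speaks MCP over stdio (hence stdin_open/tty above); to interact
# with it directly, attach to the running container's streams
docker attach guardian-mcp

Detach from an attached session with Ctrl-P Ctrl-Q rather than Ctrl-C so the container keeps running.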

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Kalvisan/guardian-mcp'
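
If you prefer to inspect the response interactively, the same endpoint can be piped through jq (a sketch assuming jq is installed; the response schema is not shown here):

curl -s 'https://glama.ai/api/mcp/v1/servers/Kalvisan/guardian-mcp' | jq .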

If you have feedback or need assistance with the MCP directory API, please join our Discord server.