
AMC MCP Server

by hi5d
docker-compose.yml
version: '3.8'

services:
  amc-mcp:
    build: .
    image: amc-mcp:latest
    container_name: amc-mcp-server
    restart: unless-stopped
    ports:
      - "8000:8000"
    environment:
      - PYTHONPATH=/app/src
      - PYTHONUNBUFFERED=1
      - MCP_LOG_LEVEL=INFO
    volumes:
      - ./data:/app/data:ro
      - ./logs:/app/logs
    networks:
      - amc-network
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s

  # Optional: add a simple web interface for testing
  amc-web:
    image: nginx:alpine
    container_name: amc-mcp-web
    restart: unless-stopped
    ports:
      - "8080:80"
    volumes:
      - ./web:/usr/share/nginx/html:ro
      - ./config/nginx.conf:/etc/nginx/conf.d/default.conf:ro
    networks:
      - amc-network
    depends_on:
      - amc-mcp

networks:
  amc-network:
    driver: bridge

volumes:
  amc-logs:
    driver: local
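A minimal usage sketch for the compose file above, assuming Docker Compose v2 is installed and the file sits in the current directory (the health URL matches the healthcheck configured in the file):

# Build the image and start the server (plus the optional web interface) in the background
docker compose up -d --build

# Follow the MCP server's logs
docker compose logs -f amc-mcp

# Verify the health endpoint responds (same URL the compose healthcheck probes)
curl -f http://localhost:8000/health

# Tear the stack down when finished
docker compose down

Note that the healthcheck runs `curl` inside the container, so it only passes if the image built from `.` includes curl.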

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hi5d/amc-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.