AI Customer Service MCP Server

by yakir-Yang
docker-compose.yml
version: '3.8'

services:
  mcp-server:
    build: .
    container_name: ai-customer-service-mcp
    ports:
      - "3000:3000"
    volumes:
      - ./data:/app/data:ro  # mount the data directory read-only
      - ./logs:/app/logs     # log directory
    environment:
      - NODE_ENV=production
      - NODE_OPTIONS=--max-old-space-size=512
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:3000/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s
    networks:
      - mcp-network
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "3"

networks:
  mcp-network:
    driver: bridge

volumes:
  data:
    driver: local
  logs:
    driver: local
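The compose file's healthcheck polls http://localhost:3000/health with curl, failing after 3 retries. The same readiness probe can be reproduced from the host with a small polling script; this is a minimal sketch (the `wait_for_health` helper is hypothetical, not part of this server), useful for waiting on the container after `docker compose up -d`:

```python
import time
import urllib.request
import urllib.error

def wait_for_health(url, retries=3, interval=1.0, timeout=10.0):
    """Poll a health endpoint until it returns a 2xx response.

    Mirrors the compose healthcheck semantics: up to `retries`
    attempts, each with its own timeout, sleeping `interval`
    seconds between attempts. Returns True on success, False
    once all retries are exhausted.
    """
    for attempt in range(retries):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if 200 <= resp.status < 300:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # connection refused / timeout: treat as not ready yet
        if attempt < retries - 1:
            time.sleep(interval)
    return False
```

For example, `wait_for_health("http://localhost:3000/health")` returns True once the container is serving, and False if the endpoint never comes up within the retry budget.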

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/yakir-Yang/mcp_demo'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.