docker-compose.yml (1.18 kB)
version: '3.8'

services:
  mcp-server:
    build:
      context: .
    restart: unless-stopped
    # Resource limit configuration
    deploy:
      resources:
        limits:
          memory: ${MEMORY_LIMIT:-2g}
          cpus: '${CPU_LIMIT:-2}'
        reservations:
          memory: ${MEMORY_RESERVATION:-512m}
    # Legacy resource limits (for compatibility)
    mem_limit: ${MEMORY_LIMIT:-2g}
    cpus: '${CPU_LIMIT:-2}'
    environment:
      - EXA_API_KEY=${EXA_API_KEY}
      - PORT=${PORT:-3111}
      - SMITHERY_TRANSPORT=shttp
      - SMITHERY_PORT=${PORT:-3111}
      - LOG_LEVEL=${LOG_LEVEL:-info}
      - DEBUG=${DEBUG:-false}
      - NODE_ENV=production
    # Logging configuration
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "3"
        compress: "true"
        labels: "service=mcp-server"
        env: "production"
    # Port mapping
    ports:
      - "${PORT:-3111}:${PORT:-3111}"
    # Network configuration
    networks:
      - mcp-network
    # Security configuration
    security_opt:
      - no-new-privileges:true
    read_only: true
    tmpfs:
      - /tmp
      - /var/tmp

# Network configuration
networks:
  mcp-network:
    driver: bridge
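The compose file reads its settings from environment variables and falls back to the defaults in the ${VAR:-default} expressions, so only EXA_API_KEY strictly has to be supplied. A convenient way to provide the values is a .env file next to docker-compose.yml. The sketch below uses the variable names from the file above; the values shown, including the API key placeholder, are illustrative assumptions:

# .env (example values only; replace the placeholder with a real Exa API key)
EXA_API_KEY=replace-with-your-exa-api-key
PORT=3111
MEMORY_LIMIT=2g
MEMORY_RESERVATION=512m
CPU_LIMIT=2
LOG_LEVEL=info
DEBUG=false

With the file in place, docker compose up --build -d builds the image and starts the server on the configured port; any variable left unset falls back to the default defined in the compose file.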

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ZooTi9er/exa-mcp-server-personal'
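Assuming the endpoint returns JSON and jq is installed, piping the output through jq pretty-prints the server's directory entry:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ZooTi9er/exa-mcp-server-personal' | jq .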

If you have feedback or need assistance with the MCP directory API, please join our Discord server.