docker-compose.yml (710 B)
version: '3.8'

services:
  task-mcp-server:
    build:
      context: .
      dockerfile: Dockerfile
    image: task-mcp-server:latest
    container_name: task-mcp-server
    stdin_open: true
    tty: true
    restart: unless-stopped
    environment:
      - PYTHONUNBUFFERED=1
    # For stdio-based MCP servers, we use stdin/stdout
    # No ports needed unless using HTTP transport
    # ports:
    #   - "8000:8000"
    # Optional: mount volume for persistent data
    # volumes:
    #   - ./data:/app/data
    # Resource limits (optional)
    deploy:
      resources:
        limits:
          cpus: '0.5'
          memory: 512M
        reservations:
          cpus: '0.25'
          memory: 256M
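A typical workflow with this compose file might look like the following (these are standard Docker Compose v2 commands; the service name `task-mcp-server` comes from the file above):

```shell
# Build the image and start the container in the background
docker compose up -d --build

# Attach to the running stdio-based server; stdin_open and tty
# in the compose file keep stdin available for the MCP transport
docker compose attach task-mcp-server

# Stop and remove the container when finished
docker compose down
```

Because the server communicates over stdin/stdout rather than HTTP, no port mappings are required; uncomment the `ports` section only if you switch to an HTTP transport.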


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/gvbigdata/MCP-Github-Deployment'
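The same endpoint can be consumed from code. A minimal sketch in Python using only the standard library (the `fetch_server_info` helper and the response field names in `sample` are illustrative assumptions, not a documented schema):

```python
import json
from urllib.request import urlopen

def fetch_server_info(url: str) -> dict:
    """Fetch an MCP directory API entry and decode its JSON body.

    Hypothetical helper: assumes the endpoint returns a JSON object.
    """
    with urlopen(url) as resp:
        return json.load(resp)

# Working with a hypothetical response body (field names assumed for
# illustration only) rather than hitting the network:
sample = json.loads('{"name": "MCP-Github-Deployment", "author": "gvbigdata"}')
print(sample["name"])  # → MCP-Github-Deployment
```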

If you have feedback or need assistance with the MCP directory API, please join our Discord server.