docker-compose.yml (801 B)
version: "3.9"

services:
  mcp-server:
    build:
      context: .
      args:
        # Set this to your OpenAPI spec URL to download at build time
        OPENAPI_SPEC_URL: ""
    container_name: mcp-openapi-server
    restart: unless-stopped
    env_file:
      - .env
    environment:
      - PYTHONUNBUFFERED=1
      - PYTHONPATH=/app/vendor
      # Native SSE transport from FastMCP
      - MCP_TRANSPORT=sse
      - MCP_HOST=0.0.0.0
      - MCP_PORT=8000
    ports:
      - "8000:8000"
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/sse"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 10s
    # Optional: resource limits for production
    # deploy:
    #   resources:
    #     limits:
    #       memory: 512M
    #       cpus: '0.5'
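A typical workflow with this configuration is to set OPENAPI_SPEC_URL in the build args (or leave it empty), provide a .env file for the env_file entry, then build and start the stack and check the SSE endpoint that the healthcheck also probes. The commands below are a minimal sketch of that workflow, not part of the template itself; they assume Docker Compose v2 and an existing .env file (it can be empty).

    # Build and start the service in the background
    docker compose up -d --build

    # Follow logs to confirm the FastMCP server is listening on 0.0.0.0:8000
    docker compose logs -f mcp-server

    # Manually probe the SSE endpoint used by the healthcheck
    # (curl -N keeps the streaming connection open; Ctrl-C to stop)
    curl -N http://localhost:8000/sse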

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/jesusperezdeveloper/mcp_openapi_template'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.