docker-compose.yml
version: '3.8'

services:
  openapi-mcp:
    build: .
    ports:
      - "8080:8080"
    environment:
      - LOG_FORMAT=pretty
      - LOG_LEVEL=debug
    command:
      - "--spec-url"
      - "https://petstore.swagger.io/v3/openapi.json"
      - "--upstream-url"
      - "https://petstore.swagger.io/v3"
    healthcheck:
      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/health"]
      interval: 30s
      timeout: 3s
      retries: 3
      start_period: 5s

  # Example with local spec file
  # openapi-mcp-local:
  #   build: .
  #   ports:
  #     - "8081:8080"
  #   volumes:
  #     - ./examples/my-api.yaml:/spec.yaml:ro
  #   command:
  #     - "--spec-file"
  #     - "/spec.yaml"
  #     - "--upstream-url"
  #     - "https://api.example.com"
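To try the stack locally, a minimal sketch (assuming Docker Compose v2 and that this file sits in the repository root) is to build and start the service, then hit the same /health endpoint the healthcheck above already probes:

docker compose up --build -d
curl http://localhost:8080/health
docker compose logs -f openapi-mcp   # debug-level, pretty-printed logs per LOG_FORMAT/LOG_LEVEL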


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/procoders/openapi-mcp-ts'
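For a more readable view of the output, the response can be piped through jq (assuming jq is installed and that the endpoint returns JSON, which is not confirmed here):

curl -s 'https://glama.ai/api/mcp/v1/servers/procoders/openapi-mcp-ts' | jq .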

If you have feedback or need assistance with the MCP directory API, please join our Discord server.