
docs-mcp-server

docker-compose.yml
# Docker Compose configuration for Docs MCP Server
#
# Scaling setup with separate services:
# - worker: Handles documentation processing
# - mcp: MCP server endpoint for AI tools
# - web: Web interface for management
#
services:
  # Worker service - handles the actual documentation processing
  worker:
    image: ghcr.io/arabold/docs-mcp-server:latest
    build:
      context: .
      dockerfile: Dockerfile
    command: ["worker", "--host", "0.0.0.0", "--port", "8080"]
    container_name: docs-mcp-worker
    restart: unless-stopped
    env_file:
      - .env
    volumes:
      - docs-mcp-data:/data
    healthcheck:
      test: 'node -e ''require("net").connect(8080, "127.0.0.1").on("connect",()=>process.exit(0)).on("error",()=>process.exit(1))'''
      interval: 5s
      timeout: 3s
      retries: 10
      start_period: 10s
    deploy:
      resources:
        limits:
          memory: 2G
        reservations:
          memory: 1G

  # MCP server - provides AI tool integration endpoint
  mcp:
    image: ghcr.io/arabold/docs-mcp-server:latest
    build:
      context: .
      dockerfile: Dockerfile
    command:
      [
        "mcp",
        "--protocol",
        "http",
        "--host",
        "0.0.0.0",
        "--port",
        "6280",
        "--server-url",
        "http://worker:8080/api",
      ]
    container_name: docs-mcp-server
    restart: unless-stopped
    ports:
      - "6280:6280"
    env_file:
      - .env
    depends_on:
      worker:
        condition: service_healthy
    deploy:
      resources:
        limits:
          memory: 512M
        reservations:
          memory: 256M

  # Web interface - provides browser-based management
  web:
    image: ghcr.io/arabold/docs-mcp-server:latest
    build:
      context: .
      dockerfile: Dockerfile
    command:
      [
        "web",
        "--host",
        "0.0.0.0",
        "--port",
        "6281",
        "--server-url",
        "http://worker:8080/api",
      ]
    container_name: docs-mcp-web
    restart: unless-stopped
    ports:
      - "6281:6281"
    env_file:
      - .env
    depends_on:
      worker:
        condition: service_healthy
    deploy:
      resources:
        limits:
          memory: 512M
        reservations:
          memory: 256M

volumes:
  docs-mcp-data:
    name: docs-mcp-data
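To run this stack, save the file as docker-compose.yml and start it with Docker Compose. The following is a minimal sketch assuming Docker Compose v2 and that your .env file provides whatever credentials the documentation pipeline needs (the OPENAI_API_KEY variable shown is an assumption; check the docs-mcp-server README for the exact variables):

# Create the .env file referenced by env_file in all three services.
# The variable name below is an example, not confirmed by the compose file itself.
echo "OPENAI_API_KEY=sk-..." > .env

# Start the worker, mcp, and web services in the background.
docker compose up -d

# The mcp and web services wait for the worker's healthcheck (depends_on: service_healthy).
docker compose ps

# Follow worker logs while documentation jobs are processed.
docker compose logs -f worker

Once the stack is up, the MCP endpoint is exposed on host port 6280 and the browser-based management UI on host port 6281, as configured in the ports sections above.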

MCP directory API

We provide all of the information about MCP servers via our MCP directory API. For example, to retrieve this server's entry:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/arabold/docs-mcp-server'
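The API returns JSON. A small, hedged sketch of inspecting the response from the shell (assumes jq is installed; the response schema is not documented here):

curl -s 'https://glama.ai/api/mcp/v1/servers/arabold/docs-mcp-server' | jq .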

If you have feedback or need assistance with the MCP directory API, please join our Discord server.