CopilotMCP

docker-compose.yml

version: '3.8'
services:
  hello-mcp:
    build: .
    ports:
      - "8000:8000"
    environment:
      - MCP=hello
    # Note: Compose substitutes ${MCP} from the host shell at parse time,
    # not from each service's environment block, so the flag value is
    # written out literally for every service.
    command: ["uv", "run", "main.py", "--mcp", "hello"]
  customer-mcp:
    build: .
    ports:
      - "8001:8000"
    environment:
      - MCP=customer_mcp
    command: ["uv", "run", "main.py", "--mcp", "customer_mcp"]
  interview-mcp:
    build: .
    ports:
      - "8002:8000"
    environment:
      - MCP=interview_mcp
    command: ["uv", "run", "main.py", "--mcp", "interview_mcp"]
  go-live-mcp:
    build: .
    ports:
      - "8003:8000"
    environment:
      - MCP=go_live_mcp
    command: ["uv", "run", "main.py", "--mcp", "go_live_mcp"]
  testing-e2e-mcp:
    build: .
    ports:
      - "8004:8000"
    environment:
      - MCP=testing_e2e_mcp
    command: ["uv", "run", "main.py", "--mcp", "testing_e2e_mcp"]
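The compose file maps each service to its own host port (8000 through 8004), all built from the same image. A minimal sketch of operating the stack, assuming the servers answer HTTP on their mapped ports (the exact request path depends on the server's transport configuration):

```shell
# Build the shared image and start all five MCP services in the background
docker compose up --build -d

# hello-mcp is mapped to host port 8000; the others follow at 8001-8004
curl http://localhost:8000/

# Tear the stack down when finished
docker compose down
```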

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/mehrshadshams/CopilotMCP'
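The endpoint above returns JSON metadata for a single server, addressed by an `owner/repo` path. A small sketch of calling it from Python using only the standard library; the `fetch_server` helper and its JSON handling are illustrative assumptions, not part of the documented API:

```python
import json
import urllib.request

# Base URL of the Glama MCP directory API, taken from the curl example above
API_BASE = "https://glama.ai/api/mcp/v1/servers"

def server_url(owner: str, repo: str) -> str:
    """Build the directory URL for one MCP server entry."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_server(owner: str, repo: str) -> dict:
    """Fetch and decode a server's JSON metadata (illustrative helper)."""
    with urllib.request.urlopen(server_url(owner, repo)) as resp:
        return json.loads(resp.read().decode("utf-8"))

print(server_url("mehrshadshams", "CopilotMCP"))
```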

If you have feedback or need assistance with the MCP directory API, please join our Discord server.