
Streamable HTTP MCP Server

by tevinric
docker-compose.yml
```yaml
version: '3.8'

services:
  mcp-server:
    build: .
    ports:
      - "8000:8000"
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s
    restart: unless-stopped

  client:
    build: .
    depends_on:
      - mcp-server
    environment:
      - AZURE_OPENAI_API_KEY=${AZURE_OPENAI_API_KEY}
      - AZURE_OPENAI_ENDPOINT=${AZURE_OPENAI_ENDPOINT}
      - AZURE_OPENAI_DEPLOYMENT_NAME=${AZURE_OPENAI_DEPLOYMENT_NAME:-gpt-4o}
      - AZURE_OPENAI_API_VERSION=${AZURE_OPENAI_API_VERSION:-2024-02-01}
    command: ["python", "client.py"]
    restart: "no"
    profiles:
      - client
```
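The compose healthcheck above curls `http://localhost:8000/health` and only needs a 2xx response. A minimal sketch of such an endpoint, using only the Python standard library, might look like the following; the route name and JSON body are assumptions to match the healthcheck, not taken from this server's actual code:

```python
# Hypothetical sketch of the /health endpoint probed by the compose healthcheck.
# Assumes the MCP server exposes a plain HTTP route alongside its MCP endpoint.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def health_payload():
    # Minimal body; the healthcheck only requires a successful status code.
    return {"status": "ok"}


class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps(health_payload()).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()


if __name__ == "__main__":
    # Bind to 0.0.0.0:8000 to match the "8000:8000" port mapping above.
    HTTPServer(("0.0.0.0", 8000), HealthHandler).serve_forever()
```

With `start_period: 40s`, the container gets 40 seconds to start before failed probes count toward the 3-retry limit.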

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/tevinric/mcp-protocol-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.