
MedX MCP Server

by yepdama
docker-compose.yml

```yaml
version: '3.8'

services:
  mcp-server:
    build: .
    container_name: mcp-server-mock-jivi
    ports:
      - "8000:8000"
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - MCP_SERVER_TOKEN=${MCP_SERVER_TOKEN:-super-secret-token}
      - OPENAI_MODEL=${OPENAI_MODEL:-gpt-4o-mini}
      - LOG_LEVEL=${LOG_LEVEL:-INFO}
      - LOG_FILE=${LOG_FILE:-logs/server.log}
      - HOST=${HOST:-0.0.0.0}
      - PORT=${PORT:-8000}
    volumes:
      - ./logs:/app/logs                    # Persist logs directory
      - ./manifest.json:/app/manifest.json  # Persist manifest
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "python -c 'import urllib.request; urllib.request.urlopen(\"http://localhost:8000/healthz\")' || exit 1"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 10s
```
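The healthcheck runs inside the container and simply issues an HTTP GET against `/healthz`, exiting non-zero if the request fails. A minimal sketch of that same probe logic, run here against a stand-in local server (with a hypothetical handler) rather than the real MCP server:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    """Stand-in for the server's /healthz endpoint (illustrative only)."""
    def do_GET(self):
        if self.path == "/healthz":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

# Bind to port 0 so the OS picks a free port (the container uses 8000)
server = HTTPServer(("127.0.0.1", 0), HealthHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# This is the same check the compose healthcheck performs:
# urlopen raises on connection failure or HTTP errors, i.e. unhealthy
status = urllib.request.urlopen(f"http://127.0.0.1:{port}/healthz").status
print(status)

server.shutdown()
```

Docker marks the container unhealthy after three consecutive failures (`retries: 3`), polling every 30 seconds with a 10-second grace period at startup.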

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/yepdama/medical-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.