YouTube Transcript MCP Server

by suckerfish
compose.yaml

services:
  yttranscript-mcp:
    build:
      context: .
      dockerfile: Dockerfile
    image: yttranscript-mcp:latest
    pull_policy: build
    container_name: yttranscript-mcp
    environment:
      - YT_TRANSCRIPT_SERVER_HOST=0.0.0.0
      - YT_TRANSCRIPT_SERVER_PORT=8080
      - YT_TRANSCRIPT_DEBUG=${DEBUG:-false}
      - PYTHONUNBUFFERED=1
      - LOG_LEVEL=${LOG_LEVEL:-INFO}
    ports:
      - "${PORT:-8081}:8080"
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-X", "POST", "http://localhost:8080/mcp/",
             "-H", "Content-Type: application/json",
             "-H", "Accept: application/json, text/event-stream",
             "-d", '{"jsonrpc":"2.0","method":"tools/call","id":"health","params":{"name":"get_available_languages","arguments":{"video_id":"9bZkp7q19f0"}}}']
      interval: 30s
      timeout: 10s
      retries: 3
    # Resource limits optimized for 512MB VPS
    deploy:
      resources:
        limits:
          cpus: '0.5'
          memory: 256M
        reservations:
          cpus: '0.1'
          memory: 64M
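The health check above sends a JSON-RPC 2.0 `tools/call` request to the server's `/mcp/` endpoint, invoking the `get_available_languages` tool on a known video ID. A minimal Python sketch of how that request body and headers are assembled (offline; it only builds the payload the `curl` command posts, and does not contact the server):

```python
import json

# JSON-RPC 2.0 payload matching the compose.yaml health check:
# call the get_available_languages tool on a well-known video ID.
payload = {
    "jsonrpc": "2.0",
    "method": "tools/call",
    "id": "health",
    "params": {
        "name": "get_available_languages",
        "arguments": {"video_id": "9bZkp7q19f0"},
    },
}

# Headers the health check sends; MCP's streamable HTTP transport
# expects clients to accept both JSON and server-sent events.
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
}

body = json.dumps(payload)
print(body)
```

Posting `body` with those headers to `http://localhost:8080/mcp/` (inside the container) reproduces the health-check request; a non-error JSON-RPC response indicates the server is up.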

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/suckerfish/yttranscript_mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.