Dockerfile (676 B)
FROM python:3.11-slim

WORKDIR /app

# Install system dependencies
RUN apt-get update && apt-get install -y \
    ffmpeg \
    && rm -rf /var/lib/apt/lists/*

# Install yt-dlp (latest version)
RUN pip install --no-cache-dir yt-dlp

# Copy and install Python package
COPY pyproject.toml .
COPY src/ ./src/
RUN pip install --no-cache-dir .

# Create directories for volumes
RUN mkdir -p /videos /tmp/loom-frames

# Set environment variables
ENV LOOM_VIDEOS_DIR=/videos
ENV LOOM_FRAMES_DIR=/tmp/loom-frames

# Keep container running - MCP server is started via docker exec
# when Claude Code connects (stdio transport requires on-demand startup)
CMD ["tail", "-f", "/dev/null"]
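Because the container only idles (CMD is tail -f /dev/null), the MCP server itself is launched over stdio on demand via docker exec when Claude Code connects. A minimal sketch of that workflow is shown below; the image name, container name, volume mount, and the loom-local-mcp-server entry point are illustrative assumptions, not confirmed by the Dockerfile.

# Build the image and start a long-lived idle container (names are assumptions)
docker build -t loom-local-mcp .
docker run -d --name loom-mcp -v "$(pwd)/videos:/videos" loom-local-mcp

# Claude Code then starts the MCP server on demand over stdio, e.g. with a
# command like this configured as the server's launch command.
# The "loom-local-mcp-server" entry point name is hypothetical.
docker exec -i loom-mcp loom-local-mcp-server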

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Slaycaster/loom-local-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.