Server Configuration

Describes the environment variables used to configure the server. All variables are optional.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `DEBUG` | No | Enable debug logging | `0` |
| `FFMPEG_PATH` | No | Path to the ffmpeg binary | `ffmpeg` |
| `YT_DLP_PATH` | No | Path to the yt-dlp binary | `yt-dlp` |
| `OPENAI_API_KEY` | No | OpenAI API key for Whisper-based subtitle generation | |
| `WHISPER_MODEL_PATH` | No | Path to the Whisper model (for local Whisper) | Auto-download |
| `WHISPER_BINARY_PATH` | No | Path to the local Whisper binary | `whisper` |
| `VIDEO_TOOLKIT_STORAGE_DIR` | No | Default directory for downloaded videos | `~/.video-toolkit/downloads` |
| `VIDEO_TOOLKIT_WHISPER_ENGINE` | No | Preferred Whisper engine: `openai`, `local`, or `auto` | `auto` |
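As a sketch, the optional variables above could be exported in the shell before launching the server. The specific paths and the `sk-...` key below are placeholders, not values from this page:

```shell
# Example environment for video-toolkit-mcp (all variables are optional).
export DEBUG=1                                    # enable debug logging
export FFMPEG_PATH=/usr/local/bin/ffmpeg          # custom ffmpeg binary (placeholder path)
export YT_DLP_PATH=/usr/local/bin/yt-dlp          # custom yt-dlp binary (placeholder path)
export OPENAI_API_KEY=sk-...                      # enables OpenAI Whisper subtitle generation
export VIDEO_TOOLKIT_STORAGE_DIR="$HOME/videos"   # override the download directory
export VIDEO_TOOLKIT_WHISPER_ENGINE=openai        # one of: openai, local, auto
```

Unset variables fall back to the defaults listed in the table.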

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM so it can take actions

No tools

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

All information about MCP servers is available via our MCP directory API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/JamesANZ/video-toolkit-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.