# vidlizer

by arizawan

## Server Configuration

Describes the environment variables required to run the server.
| Name | Required | Description | Default |
|---|---|---|---|
| PROVIDER | Yes | Provider to use: `ollama`, `openai`, or `openrouter` | |
| BATCH_SIZE | No | Frames per API call (0 = auto) | 1 |
| MAX_FRAMES | No | Maximum frames to send (hard cap 200) | 60 |
| FRAME_WIDTH | No | Frame width in pixels | 512 |
| OLLAMA_HOST | No | Ollama server URL | http://localhost:11434 |
| MAX_COST_USD | No | Abort if spend exceeds this amount (USD) | 1.00 |
| MIN_INTERVAL | No | Minimum seconds between frames | 2 |
| OLLAMA_MODEL | No | Ollama model name (e.g., `qwen2.5vl:3b`) | |
| OPENAI_MODEL | No | Model ID as reported by the server (required when PROVIDER is `openai`) | |
| FALLBACK_MODEL | No | Model ID for the fallback provider | |
| OPENAI_API_KEY | No | API key for an OpenAI-compatible server (e.g., `lm-studio` or `not-needed`) | |
| OPENAI_BASE_URL | No | Base URL for an OpenAI-compatible server (e.g., http://localhost:1234/v1) | |
| REQUEST_TIMEOUT | No | Per-request timeout in seconds | 600 |
| SCENE_THRESHOLD | No | Scene-change sensitivity, 0–1 | 0.1 |
| FALLBACK_API_KEY | No | API key for the fallback provider | |
| OPENROUTER_MODEL | No | Model slug for OpenRouter (e.g., `google/gemini-2.5-flash`) | |
| FALLBACK_BASE_URL | No | Base URL for the fallback server | |
| FALLBACK_PROVIDER | No | Fallback provider: `ollama`, `openai`, or `openrouter` | |
| OPENROUTER_API_KEY | No | OpenRouter API key | |
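As an illustration, the variables above might be combined like this for a local OpenAI-compatible server (LM Studio) with an Ollama fallback. The model names and fallback settings below are placeholders chosen from the table's examples, not defaults shipped with vidlizer:

```shell
# Sketch of a vidlizer environment for a local OpenAI-compatible server.
# Model IDs here are illustrative; use the IDs your server actually reports.
export PROVIDER=openai
export OPENAI_BASE_URL=http://localhost:1234/v1
export OPENAI_API_KEY=not-needed        # local servers typically ignore the key
export OPENAI_MODEL=qwen2.5-vl-3b       # must match the server's reported model ID

# Tune frame extraction and spending limits (values shown are the defaults).
export MAX_FRAMES=60                    # hard cap is 200
export SCENE_THRESHOLD=0.1              # scene-change sensitivity, 0-1
export MAX_COST_USD=1.00                # abort if spend exceeds this

# Optional fallback to a local Ollama instance.
export FALLBACK_PROVIDER=ollama
export OLLAMA_HOST=http://localhost:11434
export OLLAMA_MODEL=qwen2.5vl:3b
```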
## Capabilities

Server capabilities have not been inspected yet.
### Tools

Functions exposed to the LLM to take actions

| Name | Description |
|---|---|
| No tools | |
### Prompts

Interactive templates invoked by user choice

| Name | Description |
|---|---|
| No prompts | |
### Resources

Contextual data attached and managed by the client

| Name | Description |
|---|---|
| No resources | |
## MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/arizawan/vidlizer'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.