# generate_music_video
Describe a genre, mood, or theme to produce an original music track and matching video. The AI composes music and visuals that align with your input.
## Instructions
Generate a music video — AI music + video combined.
Creates both an original music track and a matching video in one go. The AI composes music and generates visuals that match the mood. Cost: ~364 credits (music + video).
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| prompt | Yes | Description of the music video (genre, mood, theme) | |
| style | No | Music genre (pop, rock, electronic, classical, lo-fi, ambient) | pop |
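For illustration, a tool call might pass arguments shaped like this (the prompt text is invented; `style` must be one of the listed genres):

```python
# Illustrative arguments for a generate_music_video tool call.
args = {
    "prompt": "A rainy neon city at night, melancholic synths",
    "style": "electronic",  # pop, rock, electronic, classical, lo-fi, or ambient
}

# style defaults to "pop" when omitted, per the input schema above.
allowed_styles = {"pop", "rock", "electronic", "classical", "lo-fi", "ambient"}
assert args["style"] in allowed_styles
```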
## Output Schema
No structured output fields. The tool returns a dict containing `status`, `video_url`, `job_id`, `credits_used`, and `balance_remaining`.
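A successful call returns a dict with the fields produced by the handler; the values below are illustrative placeholders, not real output:

```python
# Illustrative shape of a successful generate_music_video result.
# Field names come from the handler; the values here are made up.
result = {
    "status": "success",
    "video_url": "https://cdn.example.com/output/abc123.mp4",
    "job_id": "job_abc123",
    "credits_used": 364,          # ~364 credits for music + video
    "balance_remaining": 1636,
}
expected_keys = {"status", "video_url", "job_id", "credits_used", "balance_remaining"}
assert set(result) == expected_keys
```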
## Implementation Reference
- `src/yaparai/tools/generate.py:154-188` (handler): The handler function for the `generate_music_video` tool. It takes a prompt and a style (pop/rock/electronic/classical/lo-fi/ambient), sends a request to the YaparAI API with mode `suno_music_video`, waits for the result, and returns a dict with `video_url`, `job_id`, `credits_used`, and `balance_remaining`.
```python
async def generate_music_video(
    prompt: str,
    style: Literal["pop", "rock", "electronic", "classical", "lo-fi", "ambient"] = "pop",
) -> dict:
    """
    Generate a music video — AI music + video combined.

    Creates both an original music track and a matching video in one go.
    The AI composes music and generates visuals that match the mood.
    Cost: ~364 credits (music + video).

    Args:
        prompt: Description of the music video (genre, mood, theme)
        style: Music genre (pop, rock, electronic, classical, lo-fi, ambient)

    Returns:
        Dict with video_url, job_id, credits_used, and balance_remaining.
    """
    client = YaparAIClient()
    full_prompt = f"[{style}] {prompt}" if style else prompt
    job = await client.generate({
        "type": "music",
        "prompt": full_prompt,
        "mode": "suno_music_video",
    })
    result = await client.wait_for_result(job["job_id"], timeout=180)
    return {
        "status": "success",
        "video_url": result.get("result_url"),
        "job_id": result.get("job_id"),
        "credits_used": job.get("credits_used"),
        "balance_remaining": job.get("balance_remaining"),
    }
```
- `src/yaparai/server.py:126` (registration): Registers the `generate_music_video` tool with the FastMCP server via `mcp.tool(generate_music_video)`.
```python
mcp.tool(generate_music_video)
```
- `src/yaparai/server.py:24-29` (helper): Imports `generate_music_video` from `yaparai.tools.generate` into the server module.
```python
from yaparai.tools.generate import (
    generate_image,
    generate_video,
    generate_music,
    generate_music_video,
)
```
- `src/yaparai/client.py:126-128` (helper): `YaparAIClient.generate()`, used by the handler to start the generation job.
```python
async def generate(self, request: dict) -> dict:
    """Start a generation job."""
    return await self._request("POST", "/v1/public/generate", json=request)
```
- `src/yaparai/client.py:142-163` (helper): `YaparAIClient.wait_for_result()`, used by the handler to poll for job completion.
```python
async def wait_for_result(
    self,
    job_id: str,
    timeout: int = 120,
    poll_interval: int = 3,
) -> dict:
    """Poll job status until completed or timeout."""
    elapsed = 0
    while elapsed < timeout:
        job = await self.get_job(job_id)
        status = job.get("status", "")
        if status == "succeeded":
            return job
        if status == "failed":
            error = job.get("error_message") or job.get("error") or "Unknown error"
            raise RuntimeError(f"Generation failed: {error}")
        await asyncio.sleep(poll_interval)
        elapsed += poll_interval
    raise TimeoutError(
        f"Job {job_id} is still processing after {timeout}s. "
        f"Use get_job_status('{job_id}') to check later."
    )
```
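The polling pattern above can be sketched in isolation. The stub client below is a stand-in for the real `YaparAIClient` (it replays canned job states and skips the real sleep), so this is only a sketch of the succeed-or-poll-again logic, not the production code:

```python
import asyncio

class StubClient:
    """Stand-in for YaparAIClient: replays a fixed sequence of job states."""

    def __init__(self, states):
        self._states = iter(states)

    async def get_job(self, job_id: str) -> dict:
        return next(self._states)

async def wait_for_result(client, job_id, timeout=9, poll_interval=3):
    # Same loop shape as the client helper: poll until succeeded,
    # raise on failure, give up after the timeout budget is spent.
    elapsed = 0
    while elapsed < timeout:
        job = await client.get_job(job_id)
        status = job.get("status", "")
        if status == "succeeded":
            return job
        if status == "failed":
            raise RuntimeError(job.get("error") or "Unknown error")
        await asyncio.sleep(0)  # real code sleeps poll_interval seconds
        elapsed += poll_interval
    raise TimeoutError(f"Job {job_id} is still processing after {timeout}s")

job = asyncio.run(
    wait_for_result(
        StubClient([
            {"status": "processing"},
            {"status": "succeeded", "result_url": "https://example.com/v.mp4"},
        ]),
        "job-123",
    )
)
print(job["status"])  # → succeeded
```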