
get_generation

Check the status of a video or image generation created with Luma AI's Dream Machine.

Instructions

Gets the status of a generation

Input Schema

Name            Required    Description    Default
generation_id   Yes

Implementation Reference

  • The main handler function that executes the get_generation tool logic: it fetches the generation status from the Luma API using the provided generation_id and formats the response (a sketch of the _make_luma_request helper it relies on appears after this list).
    async def get_generation(parameters: dict) -> str:
        """Get the status of a generation."""
        try:
            generation_id = parameters.get("generation_id")
            if not generation_id:
                raise ValueError("generation_id parameter is required")
            result = await _make_luma_request("GET", f"/generations/{generation_id}")
            if not isinstance(result, dict):
                raise ValueError("Invalid response from API")
            output = [f"Generation ID: {result['id']}", f"State: {result['state']}"]
            if result.get("failure_reason"):
                output.append(f"Reason: {result['failure_reason']}")
            if result.get("assets", {}).get("video"):
                output.append(f"Video URL: {result['assets']['video']}")
            return "\n".join(output)
        except Exception as e:
            logger.error(f"Error in get_generation: {str(e)}", exc_info=True)
            return f"Error getting generation {generation_id}: {str(e)}"
  • Pydantic input schema for the get_generation tool, defining the required 'generation_id' field (a short usage example follows this list).
    class GetGenerationInput(BaseModel):
        generation_id: str
  • Tool registration in the MCP server's list_tools() method, defining the name, description, and input schema for get_generation (see the list_tools() sketch after this list).
    Tool(
        name=LumaTools.GET_GENERATION,
        description="Gets the status of a generation",
        inputSchema=GetGenerationInput.model_json_schema(),
    ),
  • Dispatch/handling of the get_generation tool call in the MCP server's call_tool() method (see the call_tool() sketch after this list).
    case LumaTools.GET_GENERATION:
        result = await get_generation(arguments)
        return [TextContent(type="text", text=result)]
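
The handler above calls a _make_luma_request helper whose source is not shown on this page. Below is a minimal sketch of what such a helper could look like, assuming httpx, a LUMA_API_KEY environment variable, and the public Dream Machine base URL; the project's actual helper may differ.

    import os

    import httpx

    LUMA_API_BASE = "https://api.lumalabs.ai/dream-machine/v1"  # assumed base URL

    async def _make_luma_request(method: str, path: str, json: dict | None = None) -> dict:
        """Send an authenticated request to the Luma API and return the parsed JSON body."""
        # Assumed: the API key is read from the environment and sent as a Bearer token.
        headers = {"Authorization": f"Bearer {os.environ['LUMA_API_KEY']}"}
        async with httpx.AsyncClient(base_url=LUMA_API_BASE, headers=headers, timeout=60.0) as client:
            response = await client.request(method, path, json=json)
            response.raise_for_status()
            return response.json()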
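
For reference, the GetGenerationInput model can be exercised directly to validate tool arguments and to produce the JSON Schema summarized in the Input Schema table above (Pydantic v2 assumed; the example ID is arbitrary):

    from pydantic import BaseModel

    class GetGenerationInput(BaseModel):
        generation_id: str

    # Validate incoming tool arguments.
    args = GetGenerationInput.model_validate(
        {"generation_id": "123e4567-e89b-12d3-a456-426614174000"}
    )
    print(args.generation_id)

    # The schema the server advertises via inputSchema:
    print(GetGenerationInput.model_json_schema())
    # {'properties': {'generation_id': {'title': 'Generation Id', 'type': 'string'}},
    #  'required': ['generation_id'], 'title': 'GetGenerationInput', 'type': 'object'}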
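
The Tool registration shown above sits inside the server's list_tools() handler. The following is a sketch of that context, assuming the low-level Server API from the mcp Python SDK; the server name and the comment about other tools are placeholders.

    from mcp.server import Server
    from mcp.types import Tool

    server = Server("luma-ai-mcp-server")  # server name assumed

    @server.list_tools()
    async def list_tools() -> list[Tool]:
        # LumaTools and GetGenerationInput are as defined in the items above.
        return [
            Tool(
                name=LumaTools.GET_GENERATION,
                description="Gets the status of a generation",
                inputSchema=GetGenerationInput.model_json_schema(),
            ),
            # ...other Luma tools would be registered alongside it
        ]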
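
Continuing the same sketch, the call_tool() handler that contains the dispatch case shown above might look like this; the fallback branch for unknown tool names is an assumption.

    from mcp.types import TextContent

    @server.call_tool()
    async def call_tool(name: str, arguments: dict) -> list[TextContent]:
        # Reuses the server instance and get_generation handler from the sketches above.
        match name:
            case LumaTools.GET_GENERATION:
                result = await get_generation(arguments)
                return [TextContent(type="text", text=result)]
            case _:
                raise ValueError(f"Unknown tool: {name}")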


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bobtista/luma-ai-mcp-server'
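
The same request in Python, for anyone scripting against the directory API (httpx is an arbitrary client choice; no assumptions are made about the response shape):

import httpx

response = httpx.get("https://glama.ai/api/mcp/v1/servers/bobtista/luma-ai-mcp-server")
response.raise_for_status()
print(response.json())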

If you have feedback or need assistance with the MCP directory API, please join our Discord server.