
get_job_status

Check the execution status of a ComfyUI workflow by prompt ID, monitor job progress, and optionally download the generated images.

Instructions

Get status and results of a ComfyUI job.

Checks execution status and optionally downloads generated images.

Args:
- prompt_id: The prompt ID returned by execute_workflow
- server_address: ComfyUI server address
- download_images: Whether to download generated images
- image_save_path: Directory to save images (relative to workflows/)

Returns: Job status with completion info and image paths if downloaded

Examples:
- get_job_status("12345-abcde-67890")
- get_job_status("12345-abcde-67890", download_images=True)
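The status strings returned by this tool map onto ComfyUI's history records in a simple way. A minimal sketch of that mapping, mirroring the handler shown in the Implementation Reference below (`classify_status` is an illustrative name, not part of the tool):

```python
def classify_status(status: dict) -> str:
    """Reduce a ComfyUI history 'status' dict to the tool's
    completed / failed / running strings."""
    if status.get("completed", False):
        # A completed job counts as a success only if ComfyUI says so.
        return "completed" if status.get("status_str") == "success" else "failed"
    # Present in history but not completed: still executing.
    return "running"
```

Jobs that appear in neither history nor the queue are reported separately as `not_found`.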

Input Schema

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| prompt_id | Yes | The prompt ID returned by execute_workflow | |
| server_address | No | ComfyUI server address | 127.0.0.1:8188 |
| download_images | No | Whether to download generated images | false |
| image_save_path | No | Directory to save images (relative to workflows/) | outputs |
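Only `prompt_id` is required; the other arguments fall back to the schema defaults. A hedged sketch of a caller-side merge (`with_defaults` is a hypothetical helper, not part of the server; the `download_images` default of `False` is taken from the handler signature):

```python
# Defaults from the input schema; download_images default assumed
# from the handler signature in the Implementation Reference.
DEFAULTS = {
    "server_address": "127.0.0.1:8188",
    "download_images": False,
    "image_save_path": "outputs",
}

def with_defaults(args: dict) -> dict:
    """Merge caller-supplied arguments over the schema defaults.
    prompt_id is required and has no default."""
    if "prompt_id" not in args:
        raise ValueError("prompt_id is required")
    return {**DEFAULTS, **args}
```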

Implementation Reference

  • The primary @mcp.tool-decorated handler that implements get_job_status. It queries the ComfyUI server for job history and queue status, parses the outputs, and optionally downloads images from SaveImage nodes.
```python
@mcp.tool
async def get_job_status(
    ctx: Context,
    prompt_id: str,
    server_address: str = DEFAULT_COMFYUI_SERVER,
    download_images: bool = False,
    image_save_path: str = "outputs"
) -> Dict[str, Any]:
    """Get status and results of a ComfyUI job.

    Checks execution status and optionally downloads generated images.

    Args:
        prompt_id: The prompt ID returned by execute_workflow
        server_address: ComfyUI server address
        download_images: Whether to download generated images
        image_save_path: Directory to save images (relative to workflows/)

    Returns:
        Job status with completion info and image paths if downloaded

    Examples:
        get_job_status("12345-abcde-67890")
        get_job_status("12345-abcde-67890", download_images=True)
    """
    await ctx.info(f"Checking job status for {prompt_id}")

    try:
        client = ComfyUIClient(server_address)

        # Get execution history
        history = await client.get_history(prompt_id)

        if prompt_id not in history:
            # Check queue
            queue = await client.get_queue_status()

            # Check if still in queue
            for item in queue.get("queue_running", []) + queue.get("queue_pending", []):
                if item[1] == prompt_id:
                    return {
                        "prompt_id": prompt_id,
                        "status": "running" if item in queue.get("queue_running", []) else "queued",
                        "position": queue.get("queue_pending", []).index(item) + 1
                                    if item in queue.get("queue_pending", []) else 0
                    }

            return {
                "prompt_id": prompt_id,
                "status": "not_found",
                "message": "Job not found in history or queue"
            }

        # Parse execution results
        execution = history[prompt_id]
        status = execution.get("status", {})

        result = {
            "prompt_id": prompt_id,
            "status": (
                "completed" if status.get("completed", False) and status.get("status_str") == "success"
                else "failed" if status.get("completed", False)
                else "running"
            ),
            "messages": status.get("messages", [])
        }

        # Extract outputs if completed
        if status.get("completed", False):
            outputs = execution.get("outputs", {})
            result["outputs"] = {}

            for node_id, output in outputs.items():
                if "images" in output:
                    result["outputs"][node_id] = {
                        "type": "images",
                        "count": len(output["images"]),
                        "images": output["images"]
                    }

            if download_images and result["outputs"]:
                await ctx.info("Downloading generated images...")

                # Create save directory
                save_dir = Path(image_save_path)
                save_dir.mkdir(parents=True, exist_ok=True)

                downloaded_files = []
                for node_id, output in result["outputs"].items():
                    if output["type"] == "images":
                        for i, image_info in enumerate(output["images"]):
                            # Download image
                            image_data = await client.download_image(
                                image_info["filename"],
                                image_info["subfolder"],
                                image_info["type"]
                            )

                            # Save with descriptive name
                            filename = f"{prompt_id}_{node_id}_{i:03d}_{image_info['filename']}"
                            file_path = save_dir / filename
                            file_path.write_bytes(image_data)
                            downloaded_files.append(str(file_path))

                result["downloaded_files"] = downloaded_files
                await ctx.info(f"✓ Downloaded {len(downloaded_files)} image(s) to {save_dir}")

        return result

    except Exception as e:
        raise ToolError(f"Failed to get job status: {e}")
```
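Because the tool returns immediately with a status of `running` or `queued` while a job is in flight, callers typically poll until the job settles. A minimal sketch of such a loop, assuming a `get_status` coroutine that wraps this tool (`wait_for_job` and `get_status` are illustrative names, not part of the server):

```python
import asyncio

async def wait_for_job(get_status, prompt_id: str,
                       poll_interval: float = 1.0,
                       timeout: float = 300.0) -> dict:
    """Poll get_status(prompt_id) until the job leaves the
    running/queued states, then return the final status dict."""
    loop = asyncio.get_running_loop()
    deadline = loop.time() + timeout
    while True:
        result = await get_status(prompt_id)
        if result["status"] not in ("running", "queued"):
            # completed, failed, or not_found: stop polling.
            return result
        if loop.time() >= deadline:
            raise TimeoutError(f"Job {prompt_id} did not finish within {timeout}s")
        await asyncio.sleep(poll_interval)
```

A caller would then pass `download_images=True` on the final call (or inside `get_status`) to retrieve the generated files once the status is `completed`.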

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/christian-byrne/comfy-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.