list_comfyui_queue

View running and pending jobs in the ComfyUI execution queue to monitor workflow processing status.

Instructions

List current ComfyUI execution queue.

Shows running and pending jobs in the ComfyUI queue.

Args:
    server_address: ComfyUI server address

Returns:
    Queue information with running and pending jobs

Examples:
    list_comfyui_queue()
    list_comfyui_queue("192.168.1.100:8188")

Input Schema

JSON Schema

Name            Required  Description             Default
server_address  No        ComfyUI server address  127.0.0.1:8188
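For reference, a successful call returns a dictionary shaped like the handler's `result` object shown under Implementation Reference; the values here are purely illustrative, not real queue data.

```python
# Illustrative return value of list_comfyui_queue.
# Field names match the handler's result dict; all values are hypothetical,
# and the submitted_at entries are stand-in placeholders.
example_result = {
    "server_address": "127.0.0.1:8188",
    "queue_running": 1,
    "queue_pending": 2,
    "running_jobs": [
        {"prompt_id": "aaa-111", "submitted_at": {}, "position": 0},
    ],
    "pending_jobs": [
        {"prompt_id": "bbb-222", "submitted_at": {}, "position": 1},
        {"prompt_id": "ccc-333", "submitted_at": {}, "position": 2},
    ],
}
```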

Implementation Reference

  • The primary handler function for the 'list_comfyui_queue' MCP tool. It creates a ComfyUIClient instance, fetches the queue status via API, formats the running and pending jobs, and returns a structured result.
    @mcp.tool
    async def list_comfyui_queue(
        ctx: Context,
        server_address: str = DEFAULT_COMFYUI_SERVER
    ) -> Dict[str, Any]:
        """List current ComfyUI execution queue.
        
        Shows running and pending jobs in the ComfyUI queue.
        
        Args:
            server_address: ComfyUI server address
        
        Returns:
            Queue information with running and pending jobs
        
        Examples:
            list_comfyui_queue()
            list_comfyui_queue("192.168.1.100:8188")
        """
        await ctx.info(f"Fetching queue status from {server_address}")
        
        try:
            client = ComfyUIClient(server_address)
            queue = await client.get_queue_status()
            
            result = {
                "server_address": server_address,
                "queue_running": len(queue.get("queue_running", [])),
                "queue_pending": len(queue.get("queue_pending", [])),
                "running_jobs": [],
                "pending_jobs": []
            }
            
            # Format running jobs
            for item in queue.get("queue_running", []):
                result["running_jobs"].append({
                    "prompt_id": item[1],
                    "submitted_at": item[2],
                    "position": 0  # Currently running
                })
            
            # Format pending jobs
            for i, item in enumerate(queue.get("queue_pending", [])):
                result["pending_jobs"].append({
                    "prompt_id": item[1],
                    "submitted_at": item[2],
                    "position": i + 1
                })
            
            await ctx.info(f"✓ Queue: {result['queue_running']} running, {result['queue_pending']} pending")
            return result
            
        except Exception as e:
            raise ToolError(f"Failed to get queue status: {e}") from e
  • The ComfyUIClient helper method that performs the HTTP GET request to the ComfyUI /queue endpoint to retrieve the raw queue data used by the tool handler.
    async def get_queue_status(self) -> Dict[str, Any]:
        """Get current queue status"""
        async with httpx.AsyncClient() as client:
            response = await client.get(f"{self.base_url}/queue")
            response.raise_for_status()
            return response.json()
  • Documentation in __init__.py listing the available tools, including list_comfyui_queue.
    - list_comfyui_queue: View ComfyUI queue status
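The formatting step in the handler above can be exercised without a running server. The sketch below reproduces that loop over a hypothetical raw `/queue` payload; queue entries are modeled as list-like items whose element at index 1 is the prompt ID, matching how the handler reads them (the sample entries themselves are made up).

```python
# Minimal sketch of the handler's queue-formatting logic, decoupled from
# the HTTP client. The payload structure below is a hypothetical example.

def format_queue(queue: dict) -> dict:
    """Convert a raw /queue payload into the tool's result shape."""
    result = {
        "queue_running": len(queue.get("queue_running", [])),
        "queue_pending": len(queue.get("queue_pending", [])),
        "running_jobs": [],
        "pending_jobs": [],
    }
    # Running jobs occupy position 0; pending jobs are numbered from 1.
    for item in queue.get("queue_running", []):
        result["running_jobs"].append({"prompt_id": item[1], "position": 0})
    for i, item in enumerate(queue.get("queue_pending", [])):
        result["pending_jobs"].append({"prompt_id": item[1], "position": i + 1})
    return result

# Hypothetical payload: each entry is [queue_number, prompt_id, ...].
sample = {
    "queue_running": [[0, "aaa-111", {}]],
    "queue_pending": [[1, "bbb-222", {}], [2, "ccc-333", {}]],
}
formatted = format_queue(sample)
```

Separating the formatting from the `httpx` call in this way also makes the shape of the tool's output easy to unit-test.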

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/christian-byrne/comfy-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.