break_down_task

Decomposes large tasks into manageable subtasks using Claude AI, then automatically stores them in the database for organized execution.

Instructions

Use an LLM to decompose a large task into smaller subtasks.

This returns a prompt for Claude to generate subtasks. After Claude responds, call this tool again with the response to create the subtasks in the database.

Workflow:

  1. Call break_down_task(todo_id=X) -> Returns prompt

  2. Send prompt to Claude

  3. Claude returns JSON with subtasks

  4. Subtasks are automatically created in database

Args:

  • todo_id: ID of the task to break down

  • subtask_count: Target number of subtasks (default: 5)

Returns: Prompt for Claude to generate subtasks, or confirmation if creating them
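The JSON Claude returns in step 3 might look like the following. This is an illustrative shape only; the actual field names depend on the prompt the tool generates:

```python
import json

# Hypothetical example of the subtask JSON Claude might return in step 3.
# The "subtasks" / "title" / "time_estimate" fields are assumptions,
# not the tool's confirmed schema.
claude_response = """
{
  "subtasks": [
    {"title": "Draft outline", "time_estimate": "30m"},
    {"title": "Write first section", "time_estimate": "1h"}
  ]
}
"""

parsed = json.loads(claude_response)
for i, sub in enumerate(parsed["subtasks"], start=1):
    print(f"{i}. {sub['title']} ({sub['time_estimate']})")
```

A response in this form could then be passed back to the tool so the subtasks are created in the database.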

Input Schema

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| todo_id | Yes | ID of the task to break down | |
| subtask_count | No | Target number of subtasks | 5 |

Implementation Reference

  • The `break_down_task` function, decorated with `@mcp.tool()`, implements the tool handler. It fetches the task from the database and uses `break_down_task_with_claude` to generate a prompt for task decomposition.
    async def break_down_task(todo_id: int, subtask_count: int = 5) -> str:
        """Use LLM to decompose large task into smaller subtasks.
    
        This returns a prompt for Claude to generate subtasks. After Claude responds,
        call this tool again with the response to create the subtasks in the database.
    
        Workflow:
        1. Call break_down_task(todo_id=X) -> Returns prompt
        2. Send prompt to Claude
        3. Claude returns JSON with subtasks
        4. Subtasks are automatically created in database
    
        Args:
            todo_id: ID of the task to break down
            subtask_count: Target number of subtasks (default: 5)
    
        Returns:
            Prompt for Claude to generate subtasks, or confirmation if creating them
        """
        db = await storage.get_db()
    
        # Get todo
        cursor = await db.execute(
            """
            SELECT id, title, priority, notes, timeframe, theme_tag, time_estimate
            FROM todos WHERE id = ?
            """,
            (todo_id,),
        )
        row = await cursor.fetchone()
        if not row:
            return f"Error: Todo #{todo_id} not found"
    
        todo = dict(row)
    
        # Generate prompt for Claude
        # The helper is async, so its result must be awaited
        breakdown = await break_down_task_with_claude(todo, subtask_count)
    
        response = f"**Breaking down task #{todo_id}: {todo['title']}**\n\n"
        response += "I'll help break this down into smaller steps. Here's what I recommend:\n\n"
        response += breakdown['prompt']
        response += "\n\n*Note: This is a prompt for planning. The subtasks will be created automatically based on the breakdown.*"
    
        return response
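The `todo = dict(row)` conversion above only works if the connection's `row_factory` is set to a `Row` type; with default tuple rows it would raise an error. A minimal synchronous sketch using the standard `sqlite3` module (the `todos` table here is trimmed to two columns for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row  # needed so fetched rows support dict(row)
conn.execute("CREATE TABLE todos (id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("INSERT INTO todos (id, title) VALUES (1, 'Write report')")

row = conn.execute(
    "SELECT id, title FROM todos WHERE id = ?", (1,)
).fetchone()
todo = dict(row)  # works because row_factory is sqlite3.Row
print(todo)  # {'id': 1, 'title': 'Write report'}
```

The async `aiosqlite` driver exposes the same `row_factory` attribute, so the same setup applies to the tool's database connection.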
  • The `break_down_task_with_claude` helper function generates the prompt for an LLM to decompose a task into subtasks.
    async def break_down_task_with_claude(
        todo: dict[str, Any],
        subtask_count: int = 5,
    ) -> dict[str, Any]:
        """
        Use Claude (via prompt) to break down a large task into smaller subtasks
    
        This function returns a prompt that should be sent to Claude.
        The caller (MCP tool) will handle the actual LLM call.
    
        Args:
            todo: The todo dict to break down
            subtask_count: Target number of subtasks
    
        Returns:
            Dict with prompt for LLM
        """
