break_down_task
Decomposes large tasks into manageable subtasks using Claude AI, then automatically stores them in the database for organized execution.
Instructions
Uses an LLM to decompose a large task into smaller subtasks.
This returns a prompt for Claude to generate subtasks. After Claude responds, call this tool again with the response to create the subtasks in the database.
Workflow:
1. Call `break_down_task(todo_id=X)` -> returns a prompt
2. Send the prompt to Claude
3. Claude returns JSON with subtasks
4. Subtasks are automatically created in the database
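The JSON returned in step 3 might look like the following. This is an illustrative sketch only: the exact field names (`subtasks`, `title`, `time_estimate`) are assumptions, since the real schema is defined by the prompt that `break_down_task` generates.

```python
import json

# Hypothetical Claude response for step 3 of the workflow.
# Field names here are assumptions for illustration; the actual
# schema is dictated by the generated prompt.
claude_response = """
{
  "subtasks": [
    {"title": "Draft outline", "time_estimate": "30m"},
    {"title": "Write first section", "time_estimate": "1h"},
    {"title": "Review and edit", "time_estimate": "45m"}
  ]
}
"""

parsed = json.loads(claude_response)
for i, sub in enumerate(parsed["subtasks"], start=1):
    print(f"{i}. {sub['title']} ({sub['time_estimate']})")
```

Once parsed, each entry would map onto a row in the `todos` table as a child of the original task.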
Args:
- `todo_id`: ID of the task to break down
- `subtask_count`: Target number of subtasks (default: 5)
Returns: Prompt for Claude to generate subtasks, or confirmation if creating them
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| todo_id | Yes | ID of the task to break down | |
| subtask_count | No | Target number of subtasks | 5 |
Implementation Reference
- `src/coach_ai/server.py:501-544` (handler)

  The `break_down_task` function, decorated with `@mcp.tool()`, implements the tool handler. It fetches the task from the database and uses `break_down_task_with_claude` to generate a prompt for task decomposition.
  ```python
  async def break_down_task(todo_id: int, subtask_count: int = 5) -> str:
      """Use LLM to decompose large task into smaller subtasks.

      This returns a prompt for Claude to generate subtasks. After Claude
      responds, call this tool again with the response to create the
      subtasks in the database.

      Workflow:
      1. Call break_down_task(todo_id=X) -> Returns prompt
      2. Send prompt to Claude
      3. Claude returns JSON with subtasks
      4. Subtasks are automatically created in database

      Args:
          todo_id: ID of the task to break down
          subtask_count: Target number of subtasks (default: 5)

      Returns:
          Prompt for Claude to generate subtasks, or confirmation if creating them
      """
      db = await storage.get_db()

      # Get todo
      cursor = await db.execute(
          """
          SELECT id, title, priority, notes, timeframe, theme_tag, time_estimate
          FROM todos
          WHERE id = ?
          """,
          (todo_id,),
      )
      row = await cursor.fetchone()

      if not row:
          return f"Error: Todo #{todo_id} not found"

      todo = dict(row)

      # Generate prompt for Claude (the helper is async, so it must be awaited)
      breakdown = await break_down_task_with_claude(todo, subtask_count)

      response = f"**Breaking down task #{todo_id}: {todo['title']}**\n\n"
      response += "I'll help break this down into smaller steps. Here's what I recommend:\n\n"
      response += breakdown['prompt']
      response += "\n\n*Note: This is a prompt for planning. The subtasks will be created automatically based on the breakdown.*"

      return response
  ```

- `src/coach_ai/task_breakdown.py:14-30` (helper)

  The `break_down_task_with_claude` helper function generates the prompt for an LLM to decompose a task into subtasks.
  ```python
  async def break_down_task_with_claude(
      todo: dict[str, Any],
      subtask_count: int = 5,
  ) -> dict[str, Any]:
      """
      Use Claude (via prompt) to break down a large task into smaller subtasks

      This function returns a prompt that should be sent to Claude.
      The caller (MCP tool) will handle the actual LLM call.

      Args:
          todo: The todo dict to break down
          subtask_count: Target number of subtasks

      Returns:
          Dict with prompt for LLM
      """
  ```
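The excerpt above stops at the docstring. As a minimal sketch of the kind of prompt-building the helper performs, here is a hypothetical, simplified `build_breakdown_prompt` function. The name and prompt wording are illustrative assumptions; the real implementation lives in `src/coach_ai/task_breakdown.py`.

```python
from typing import Any


def build_breakdown_prompt(todo: dict[str, Any], subtask_count: int = 5) -> dict[str, Any]:
    """Hypothetical sketch of the prompt dict the helper might return.

    Assumes the helper formats the todo's fields into a single prompt
    string under a "prompt" key; the actual implementation may differ.
    """
    prompt = (
        f"Break the task '{todo['title']}' into {subtask_count} subtasks.\n"
        f"Notes: {todo.get('notes') or 'none'}\n"
        'Respond with JSON: {"subtasks": [{"title": "..."}, ...]}'
    )
    return {"prompt": prompt}


result = build_breakdown_prompt({"title": "Write report", "notes": None}, subtask_count=3)
print(result["prompt"])
```

The returned dict's `prompt` value is what the handler concatenates into its response for Claude to act on.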