batch_update_blocks
Update multiple blocks in a Mnemosyne knowledge graph in a single transaction, modifying attributes and/or XML content. More efficient than issuing individual updates.
Instructions
Update multiple blocks in a single transaction. More efficient than individual update_block calls. Each update can specify attributes to change and/or new XML content. Returns results for each update.
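For example, a call to this tool might pass arguments shaped like the following. All IDs and block contents here are illustrative, not taken from a real graph:

```python
# Illustrative arguments for batch_update_blocks; every ID is hypothetical.
arguments = {
    "graph_id": "graph-123",
    "document_id": "doc-456",
    "updates": [
        # Change only attributes on one block...
        {"block_id": "blk-1", "attributes": {"status": "done"}},
        # ...replace only the XML content of another...
        {"block_id": "blk-2", "content": "<paragraph>Revised text</paragraph>"},
        # ...or do both in the same update spec.
        {
            "block_id": "blk-3",
            "attributes": {"collapsed": True},
            "content": "<paragraph>New body</paragraph>",
        },
    ],
}
```

Each entry in `updates` must carry a `block_id`; `attributes` and `content` are each optional.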
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| graph_id | Yes | The graph containing the document | |
| document_id | Yes | The document containing the blocks | |
| updates | Yes | List of update specs, each with a required `block_id` and optional `attributes` (dict of attributes to update) and/or `content` (new XML content for the block) | |
Implementation Reference
- The handler function `batch_update_blocks_tool` executes the core logic for batch updating blocks in a document. It authenticates, validates inputs, connects to the document, performs atomic updates in a transaction via `hp_client.transact_document`, and returns detailed results including per-update success status and counts.

```python
async def batch_update_blocks_tool(
    graph_id: str,
    document_id: str,
    updates: list[Dict[str, Any]],
    context: Context | None = None,
) -> dict:
    """Batch update multiple blocks atomically.

    Args:
        graph_id: The graph containing the document
        document_id: The document containing the blocks
        updates: List of update specs, each with:
            - block_id (required): The block to update
            - attributes (optional): Dict of attributes to update
            - content (optional): New XML content for the block
    """
    auth = MCPAuthContext.from_context(context)
    auth.require_auth()

    if not graph_id or not graph_id.strip():
        raise ValueError("graph_id is required")
    if not document_id or not document_id.strip():
        raise ValueError("document_id is required")
    if not updates:
        raise ValueError("updates list is required and cannot be empty")

    try:
        await hp_client.connect_document(graph_id.strip(), document_id.strip())

        results: list[Dict[str, Any]] = []

        def perform_batch(doc: Any) -> None:
            writer = DocumentWriter(doc)
            for update in updates:
                block_id = update.get("block_id")
                if not block_id:
                    results.append({"error": "missing block_id"})
                    continue
                try:
                    if "content" in update:
                        writer.replace_block_by_id(block_id, update["content"])
                    if "attributes" in update:
                        writer.update_block_attributes(block_id, update["attributes"])
                    results.append({"block_id": block_id, "success": True})
                except Exception as e:
                    results.append({"block_id": block_id, "error": str(e)})

        await hp_client.transact_document(
            graph_id.strip(),
            document_id.strip(),
            perform_batch,
        )

        return {
            "success": all(r.get("success") for r in results),
            "graph_id": graph_id.strip(),
            "document_id": document_id.strip(),
            "results": results,
            "updated_count": sum(1 for r in results if r.get("success")),
            "error_count": sum(1 for r in results if "error" in r),
        }
    except Exception as e:
        logger.error(
            "Failed to batch update blocks",
            extra_context={
                "graph_id": graph_id,
                "document_id": document_id,
                "update_count": len(updates),
                "error": str(e),
            },
        )
        raise RuntimeError(f"Failed to batch update blocks: {e}")
```
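The per-update results are reduced into the summary fields of the return value. That aggregation, extracted here for illustration, is equivalent to:

```python
from typing import Any, Dict, List

def summarize_results(results: List[Dict[str, Any]]) -> Dict[str, Any]:
    """Mirror the handler's aggregation of per-update results."""
    return {
        # Overall success requires every update to have succeeded.
        "success": all(r.get("success") for r in results),
        "updated_count": sum(1 for r in results if r.get("success")),
        "error_count": sum(1 for r in results if "error" in r),
    }

summary = summarize_results([
    {"block_id": "blk-1", "success": True},
    {"block_id": "blk-2", "error": "block not found"},
])
# success is False because one update failed; updated_count is 1, error_count is 1.
```

Note that a single failed update makes the top-level `success` flag `False` even though other updates in the same transaction may have succeeded, so callers should inspect `results` for per-block outcomes.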
- src/neem/mcp/tools/hocuspocus.py:1229-1237 (registration) — The `batch_update_blocks` tool is registered with the `@server.tool` decorator, which defines the tool's name, title, and description for the MCP server.

```python
@server.tool(
    name="batch_update_blocks",
    title="Batch Update Blocks",
    description=(
        "Update multiple blocks in a single transaction. More efficient than "
        "individual update_block calls. Each update can specify attributes to change "
        "and/or new XML content. Returns results for each update."
    ),
)
```
- The input schema is defined by the function parameters and docstring: `graph_id` and `document_id` are required strings, `updates` is a list of update specs (`block_id` required; `attributes` and `content` optional), and `context` is optional.

```python
    graph_id: str,
    document_id: str,
    updates: list[Dict[str, Any]],
    context: Context | None = None,
) -> dict:
    """Batch update multiple blocks atomically.

    Args:
        graph_id: The graph containing the document
        document_id: The document containing the blocks
        updates: List of update specs, each with:
            - block_id (required): The block to update
            - attributes (optional): Dict of attributes to update
            - content (optional): New XML content for the block
    """
```
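Because the handler records a missing `block_id` as an error but silently does nothing for an update spec that names neither `attributes` nor `content`, a caller may want to pre-validate specs before sending them. A minimal client-side check (a hypothetical helper, not part of the tool) could look like:

```python
from typing import Any, Dict

def validate_update_spec(update: Dict[str, Any]) -> None:
    """Reject update specs the server would fail on or silently no-op."""
    if not update.get("block_id"):
        raise ValueError("missing block_id")
    if "attributes" not in update and "content" not in update:
        raise ValueError("update must include 'attributes' and/or 'content'")

# A well-formed spec passes without raising.
validate_update_spec({"block_id": "blk-1", "content": "<paragraph>ok</paragraph>"})
```

Running this check before calling the tool surfaces malformed specs early, rather than discovering them in the `results` array after the transaction.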