batch_operation
Execute multiple operations concurrently, with configurable concurrency, per-operation timeouts, and error handling. Successful results can optionally be cached for reuse in MCP workflows.
Instructions
Process multiple operations with configurable concurrency and error handling
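
For illustration, the sketch below calls this tool from a TypeScript MCP client. It assumes the standard `@modelcontextprotocol/sdk` client API; the server command, operation ids, types, and data payloads are placeholders rather than values from this repository.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Placeholder server command; adjust to however this server is actually launched.
const transport = new StdioClientTransport({ command: "node", args: ["dist/index-v2.js"] });
const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Only `operations` is required; all other parameters fall back to their defaults.
const result = await client.callTool({
  name: "batch_operation",
  arguments: {
    operations: [
      { id: "op-1", type: "example", data: { value: 1 } },
      { id: "op-2", type: "example", data: { value: 2 } }
    ]
  }
});

console.log(result.content); // Text content containing the batch summary JSON
```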
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| cache_ttl_seconds | No | TTL for cached results, in seconds | 300 |
| concurrency | No | Maximum number of concurrent operations (1-20) | 5 |
| continue_on_error | No | Continue processing even if some operations fail | true |
| operations | Yes | Array of operations to process (1-100 items); each item requires `id`, `type`, and `data` | |
| timeout_ms | No | Timeout per operation in milliseconds (1000-300000) | 30000 |
| use_cache | No | Cache successful results | false |
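
As an illustration of these parameters, the sketch below shows an arguments object that sets every field explicitly; the operation `id`, `type`, and `data` values are placeholders.

```typescript
// Illustrative batch_operation arguments; only `operations` is required.
const batchArgs = {
  operations: [
    { id: "op-1", type: "example", data: { value: 1 } },
    { id: "op-2", type: "example", data: { value: 2 } }
  ],
  concurrency: 5,           // default 5, allowed range 1-20
  timeout_ms: 30000,        // default 30000, allowed range 1000-300000 ms
  continue_on_error: true,  // default true
  use_cache: true,          // default false
  cache_ttl_seconds: 300    // default 300 seconds
};
```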
Implementation Reference
- src/index-v2.ts:521-638 (handler): the handler for the `batch_operation` tool. It drains the operations array with controlled concurrency, applies a per-operation timeout, optionally caches successful results, and either continues or stops when an operation fails, depending on `continue_on_error`. Execution is currently simulated (the promise-based `setTimeout` from Node's `timers/promises` appears to be assumed), but the handler is structured for real operations; a sketch of wiring in real work follows after this reference list.

  ```typescript
  case "batch_operation": {
    const {
      operations,
      concurrency = 5,
      timeout_ms = 30000,
      continue_on_error = true,
      use_cache = false,
      cache_ttl_seconds = 300
    } = args as any;

    const results: any[] = [];
    const queue = [...operations];
    const inProgress = new Map<string, Promise<any>>();

    // Process operations with controlled concurrency
    while (queue.length > 0 || inProgress.size > 0) {
      // Start new operations up to concurrency limit
      while (queue.length > 0 && inProgress.size < concurrency) {
        const op = queue.shift()!;

        // Check cache first if enabled
        if (use_cache) {
          const cacheKey = `batch:${op.type}:${JSON.stringify(op.data)}`;
          const cached = cache.get(cacheKey);
          if (cached && cached.expiresAt > Date.now()) {
            results.push({ id: op.id, success: true, cached: true, result: cached.value });
            continue;
          }
        }

        // Create operation promise
        const promise = Promise.race([
          // Simulate operation execution
          (async () => {
            // In real implementation, this would execute the actual operation
            await setTimeout(Math.random() * 1000); // Simulate work
            const result = {
              id: op.id,
              type: op.type,
              data: op.data,
              processed_at: new Date().toISOString()
            };

            // Cache result if enabled
            if (use_cache) {
              const cacheKey = `batch:${op.type}:${JSON.stringify(op.data)}`;
              cache.set(cacheKey, {
                value: result,
                expiresAt: Date.now() + (cache_ttl_seconds * 1000)
              });
            }

            return result;
          })(),
          // Timeout promise
          setTimeout(timeout_ms).then(() => {
            throw new Error(`Operation ${op.id} timed out`);
          })
        ]);

        inProgress.set(op.id, promise);

        // Handle completion
        promise
          .then(result => {
            results.push({ id: op.id, success: true, result });
          })
          .catch(error => {
            results.push({ id: op.id, success: false, error: error.message });
            if (!continue_on_error) {
              // Cancel remaining operations
              queue.length = 0;
            }
          })
          .finally(() => {
            inProgress.delete(op.id);
          });
      }

      // Wait for at least one operation to complete
      if (inProgress.size > 0) {
        await Promise.race(inProgress.values());
      }
    }

    // Sort results to match input order
    const sortedResults = operations.map((op: any) =>
      results.find(r => r.id === op.id)
    );

    return {
      content: [{
        type: "text",
        text: JSON.stringify({
          success: true,
          total_operations: operations.length,
          successful: results.filter(r => r.success).length,
          failed: results.filter(r => !r.success).length,
          results: sortedResults
        }, null, 2)
      }]
    };
  }
  ```
- src/index-v2.ts:179-237 (schema): input schema for the `batch_operation` tool, defining the structure of the `operations` array and the optional `concurrency`, `timeout_ms`, `continue_on_error`, `use_cache`, and `cache_ttl_seconds` parameters.

  ```typescript
  inputSchema: {
    type: "object",
    properties: {
      operations: {
        type: "array",
        description: "Array of operations to process",
        items: {
          type: "object",
          properties: {
            id: { type: "string", description: "Unique identifier for this operation" },
            type: { type: "string", description: "Type of operation" },
            data: { type: "object", description: "Operation-specific data" }
          },
          required: ["id", "type", "data"]
        },
        minItems: 1,
        maxItems: 100
      },
      concurrency: {
        type: "number",
        description: "Maximum number of concurrent operations",
        default: 5,
        minimum: 1,
        maximum: 20
      },
      timeout_ms: {
        type: "number",
        description: "Timeout per operation in milliseconds",
        default: 30000,
        minimum: 1000,
        maximum: 300000
      },
      continue_on_error: {
        type: "boolean",
        description: "Continue processing even if some operations fail",
        default: true
      },
      use_cache: {
        type: "boolean",
        description: "Cache successful results",
        default: false
      },
      cache_ttl_seconds: {
        type: "number",
        description: "TTL for cached results",
        default: 300
      }
    },
    required: ["operations"]
  }
  ```
- src/index-v2.ts:176-238 (registration): the tool definition object for `batch_operation` included in the tools list returned by the listTools handler, registering the tool's name, description, and schema.

  ```typescript
  {
    name: "batch_operation",
    description: "Process multiple operations with configurable concurrency and error handling",
    inputSchema: {
      // Identical to the input schema shown in the schema entry above (src/index-v2.ts:179-237)
    }
  },
  ```
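
Because the handler simulates each operation, adapting it for real work mainly means replacing the simulated async block. The sketch below shows one possible shape for that, using a hypothetical `executeOperation` dispatcher keyed on the operation's `type`; the helper name, the `fetch_url` case, and its `data` fields are illustrative and not part of the current source.

```typescript
// Hypothetical dispatcher; the simulated block inside the handler would become roughly
// `const result = await executeOperation(op);`, keeping the existing caching,
// timeout, and result bookkeeping unchanged.
async function executeOperation(op: { id: string; type: string; data: any }): Promise<any> {
  switch (op.type) {
    case "fetch_url": {
      // Illustrative case: fetch a URL carried in the operation's data payload.
      const response = await fetch(op.data.url);
      return { status: response.status, body: await response.text() };
    }
    default:
      throw new Error(`Unknown operation type: ${op.type}`);
  }
}
```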