
n8n-workflow-builder-mcp

by ifmelate
splitInBatches.json (1.52 kB)
{
  "nodeType": "n8n-nodes-base.splitInBatches",
  "displayName": "SplitInBatches",
  "description": null,
  "version": "3",
  "properties": [
    {
      "name": "splitInBatchesNotice",
      "displayName": "You may not need this node — n8n nodes automatically run once for each input item. <a href=\"https://docs.n8n.io/getting-started/key-concepts/looping.html#using-loops-in-n8n\" target=\"_blank\">More info</a>",
      "type": "notice",
      "default": ""
    },
    {
      "name": "batchSize",
      "displayName": "Batch Size",
      "type": "number",
      "default": 1,
      "description": "The number of items to return with each call",
      "typeOptions": { "minValue": 1 }
    },
    {
      "name": "options",
      "displayName": "Options",
      "type": "collection",
      "default": {},
      "description": "Whether the node will be reset and so with the current input-data newly initialized",
      "placeholder": "Add option",
      "options": [
        {
          "name": "reset",
          "displayName": "Reset",
          "type": "boolean",
          "default": false,
          "description": "Whether the node will be reset and so with the current input-data newly initialized"
        }
      ]
    }
  ],
  "credentialsConfig": [],
  "io": { "inputs": [], "outputs": [], "outputNames": [], "hints": {} },
  "wiring": { "role": "generic", "requires": [], "optional": [], "consumedBy": [], "consumes": [], "produces": [] }
}
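As a sketch of how a client might consume this schema, the hypothetical Python helper below (not part of the MCP server, assumed names) checks a node's parameters against the `batchSize` constraint declared in `typeOptions.minValue`:

```python
# Hypothetical validation helper; the schema dict mirrors the
# splitInBatches.json properties above (trimmed to the relevant fields).
schema = {
    "properties": [
        {"name": "batchSize", "type": "number", "default": 1,
         "typeOptions": {"minValue": 1}},
        {"name": "options", "type": "collection", "default": {}},
    ]
}

def validate_params(params: dict, schema: dict) -> list:
    """Return a list of validation errors (empty if params are valid)."""
    errors = []
    by_name = {p["name"]: p for p in schema["properties"]}
    for key, value in params.items():
        prop = by_name.get(key)
        if prop is None:
            errors.append(f"unknown parameter: {key}")
            continue
        # Enforce numeric lower bounds declared via typeOptions.minValue.
        min_value = prop.get("typeOptions", {}).get("minValue")
        if min_value is not None and value < min_value:
            errors.append(f"{key} must be >= {min_value}")
    return errors

print(validate_params({"batchSize": 0}, schema))  # ['batchSize must be >= 1']
print(validate_params({"batchSize": 5}, schema))  # []
```

Because `batchSize` has `minValue: 1`, a value of 0 is rejected while any positive batch size passes.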

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ifmelate/n8n-workflow-builder-mcp'
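The same endpoint can be called from Python; the sketch below builds the URL following the curl command above (the response schema is not documented here, so the decoded payload is returned as-is):

```python
import json
import urllib.request

API_BASE = "https://glama.ai/api/mcp/v1/servers"

def server_url(author: str, name: str) -> str:
    """Build the MCP directory API URL for a server entry."""
    return f"{API_BASE}/{author}/{name}"

def fetch_server(author: str, name: str) -> dict:
    """Fetch and JSON-decode a server entry from the directory API."""
    with urllib.request.urlopen(server_url(author, name)) as resp:
        return json.load(resp)

print(server_url("ifmelate", "n8n-workflow-builder-mcp"))
# https://glama.ai/api/mcp/v1/servers/ifmelate/n8n-workflow-builder-mcp
```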

If you have feedback or need assistance with the MCP directory API, please join our Discord server.