
n8n-workflow-builder-mcp

by ifmelate
langchain_chainSummarization.json (3.28 kB)
{
  "nodeType": "@n8n/n8n-nodes-langchain.chainSummarization",
  "displayName": "ChainSummarization",
  "description": null,
  "version": "2",
  "properties": [
    {
      "name": "operationMode",
      "displayName": "Data to Summarize",
      "type": "options",
      "default": "nodeInputJson",
      "description": "How to pass data into the summarization chain",
      "options": [
        {
          "name": "Use Node Input (JSON)",
          "value": "nodeInputJson",
          "description": "Summarize the JSON data coming into this node from the previous one"
        },
        {
          "name": "Use Node Input (Binary)",
          "value": "nodeInputBinary",
          "description": "Summarize the binary data coming into this node from the previous one"
        },
        {
          "name": "Use Document Loader",
          "value": "documentLoader",
          "description": "Use a loader sub-node with more configuration options"
        }
      ]
    },
    {
      "name": "chunkingMode",
      "displayName": "Chunking Strategy",
      "type": "options",
      "default": "simple",
      "description": "Chunk splitting strategy",
      "options": [
        { "name": "Simple (Define Below)", "value": "simple" },
        {
          "name": "Advanced",
          "value": "advanced",
          "description": "Use a splitter sub-node with more configuration options"
        }
      ],
      "displayOptions": {
        "show": { "/operationMode": ["nodeInputJson", "nodeInputBinary"] }
      }
    },
    {
      "name": "chunkSize",
      "displayName": "Characters Per Chunk",
      "type": "number",
      "default": 1000,
      "description": "Controls the max size (in terms of number of characters) of the final document chunk",
      "displayOptions": {
        "show": { "/chunkingMode": ["simple"] }
      }
    },
    {
      "name": "chunkOverlap",
      "displayName": "Chunk Overlap (Characters)",
      "type": "number",
      "default": 200,
      "description": "Specifies how much characters overlap there should be between chunks",
      "displayOptions": {
        "show": { "/chunkingMode": ["simple"] }
      }
    },
    {
      "name": "options",
      "displayName": "Options",
      "type": "collection",
      "default": {},
      "description": "The name of the field in the agent or chain’s input that contains the binary file to be processed",
      "placeholder": "Add Option",
      "displayOptions": {
        "show": { "/operationMode": ["nodeInputBinary"] }
      }
    }
  ],
  "credentialsConfig": [
    { "name": "operationMode", "required": false },
    { "name": "chunkingMode", "required": false },
    { "name": "chunkSize", "required": false },
    { "name": "combineMapPrompt", "required": false }
  ],
  "io": { "inputs": [], "outputs": [], "outputNames": [], "hints": {} },
  "wiring": {
    "role": "generic",
    "requires": [],
    "optional": [],
    "consumedBy": [],
    "consumes": [],
    "produces": []
  }
}
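The "displayOptions.show" entries in the definition above gate when a property appears: a property is visible only while every referenced parameter (keys like "/chunkingMode" point at a parameter on the node) currently holds one of the listed values. A minimal sketch of that gating logic, assuming this interpretation of "show" (the helper name is hypothetical, not part of n8n):

```python
# Sketch of how a displayOptions.show condition can be evaluated.
# `is_shown` is a hypothetical helper, not an n8n API.

def is_shown(prop: dict, params: dict) -> bool:
    """Return True if `prop` should be visible given the current parameter values."""
    show = prop.get("displayOptions", {}).get("show", {})
    for key, allowed in show.items():
        # Keys such as "/chunkingMode" reference a parameter at the node's root.
        name = key.lstrip("/")
        if params.get(name) not in allowed:
            return False
    return True  # no conditions means the property is always shown

# "chunkSize" from the definition above: shown only in simple chunking mode.
chunk_size = {
    "name": "chunkSize",
    "displayOptions": {"show": {"/chunkingMode": ["simple"]}},
}

print(is_shown(chunk_size, {"chunkingMode": "simple"}))    # True
print(is_shown(chunk_size, {"chunkingMode": "advanced"}))  # False
```

With this reading, "chunkSize" and "chunkOverlap" disappear as soon as "chunkingMode" switches to "advanced", and "options" only appears for binary input.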

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ifmelate/n8n-workflow-builder-mcp'
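The same request can be made from Python with only the standard library. This is a sketch: the URL path shape is taken from the curl example above, and the response schema is not documented here, so the code simply returns the parsed JSON.

```python
# Fetch a server entry from the Glama MCP directory API
# (equivalent to the curl command above; stdlib only).
import json
from urllib.request import urlopen

BASE = "https://glama.ai/api/mcp/v1/servers"

def server_url(author: str, name: str) -> str:
    """Build the directory URL for a server, following the curl example's path."""
    return f"{BASE}/{author}/{name}"

def fetch_server(author: str, name: str) -> dict:
    """GET the server entry and parse the JSON body (schema not documented here)."""
    with urlopen(server_url(author, name)) as resp:
        return json.load(resp)

print(server_url("ifmelate", "n8n-workflow-builder-mcp"))
```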

If you have feedback or need assistance with the MCP directory API, please join our Discord server.