
n8n-workflow-builder-mcp

by ifmelate
langchain_chainLlm.json (2.48 kB)
{ "nodeType": "@n8n/n8n-nodes-langchain.chainLlm", "displayName": "Basic LLM Chain", "description": "A simple chain to prompt a large language model", "version": [ 1, 1.1, 1.2, 1.3, 1.4, 1.5 ], "properties": [ { "name": "prompt", "displayName": "Prompt", "type": "string", "default": "={{ $json.input }}", "required": true, "displayOptions": { "show": { "@version": [ 1 ] } } }, { "name": "text", "displayName": "Text", "type": "string", "default": "", "placeholder": "e.g. Hello, how can you help me?", "required": true, "typeOptions": { "rows": 2 }, "displayOptions": { "show": { "promptType": [ "define" ] } } }, { "name": "hasOutputParser", "displayName": "Require Specific Output Format", "type": "boolean", "default": false, "displayOptions": { "hide": { "@version": [ 1, 1.1, 1.3 ] } } }, { "name": "messages", "displayName": "Chat Messages (if Using a Chat Model)", "type": "fixedCollection", "default": {}, "description": "Simple text message", "placeholder": "Add prompt", "required": true, "typeOptions": { "multipleValues": true }, "displayOptions": { "show": { "type": [ "HumanMessagePromptTemplate.lc_name()" ] } } }, { "name": "notice", "type": "notice", "default": "", "displayOptions": { "show": { "hasOutputParser": [ true ] } } } ], "credentialsConfig": [ { "name": "prompt", "required": true }, { "name": "text", "required": true }, { "name": "messageType", "required": false }, { "name": "binaryImageDataKey", "required": true }, { "name": "message", "required": true }, { "name": "notice", "required": false } ], "io": { "inputs": [], "outputs": [], "outputNames": [], "hints": {} }, "wiring": { "role": "generic", "requires": [], "optional": [], "consumedBy": [], "consumes": [], "produces": [] } }

MCP directory API

We provide all information about MCP servers in the directory through our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ifmelate/n8n-workflow-builder-mcp'
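The same endpoint can be queried from code. A minimal TypeScript sketch follows; the response shape is not documented here, so it is returned as unparsed JSON and should be inspected before relying on specific fields.

// Fetch the directory entry for this server (Node 18+ or any runtime with global fetch).
const url = "https://glama.ai/api/mcp/v1/servers/ifmelate/n8n-workflow-builder-mcp";

async function getServerInfo(): Promise<unknown> {
  const res = await fetch(url);
  if (!res.ok) {
    throw new Error(`Request failed: ${res.status} ${res.statusText}`);
  }
  return res.json(); // JSON payload describing the MCP server; shape is an assumption
}

getServerInfo().then((info) => console.log(info)).catch(console.error);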

If you have feedback or need assistance with the MCP directory API, please join our Discord server.