Glama

n8n-workflow-builder-mcp

by ifmelate
langchain_chat.json (1.39 kB)
{
  "nodeType": "@n8n/n8n-nodes-langchain.chat",
  "displayName": "Respond to Chat",
  "description": "Send a message to a chat",
  "version": 1,
  "properties": [
    {
      "name": "generalNotice",
      "displayName": "Verify you're using a chat trigger with the 'Response Mode' option set to 'Using Response Nodes'",
      "type": "notice",
      "default": ""
    },
    {
      "name": "message",
      "displayName": "Message",
      "type": "string",
      "default": "",
      "required": true,
      "typeOptions": { "rows": 6 }
    },
    {
      "displayName": "Wait for User Reply",
      "type": "boolean",
      "default": true
    },
    {
      "name": "options",
      "displayName": "Options",
      "type": "collection",
      "default": {},
      "placeholder": "Add Option",
      "options": [
        {
          "name": "memoryConnection",
          "displayName": "Add Memory Input Connection",
          "type": "boolean",
          "default": false
        }
      ]
    }
  ],
  "credentialsConfig": [],
  "io": {
    "inputs": [ "AiMemory" ],
    "outputs": [ "Main" ],
    "outputNames": [],
    "hints": {}
  },
  "wiring": {
    "role": "chatTrigger",
    "requires": [],
    "optional": [ "AiMemory" ],
    "consumedBy": [],
    "consumes": [ "AiMemory" ],
    "produces": [ "Main" ]
  }
}
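A consumer of a node definition like the one above typically inspects its `properties` list (for example, to find which fields are required) and its `io` block (to learn what connection types it accepts). Here is a minimal sketch of that in Python; the `NODE_DEF` literal is a condensed copy of the file above (the notice and options properties are omitted), and `required_properties` is a hypothetical helper, not part of any published API:

```python
import json

# Condensed copy of langchain_chat.json (notice/options properties omitted).
NODE_DEF = json.loads("""
{
  "nodeType": "@n8n/n8n-nodes-langchain.chat",
  "displayName": "Respond to Chat",
  "properties": [
    {"name": "message", "displayName": "Message", "type": "string",
     "default": "", "required": true},
    {"displayName": "Wait for User Reply", "type": "boolean", "default": true}
  ],
  "io": {"inputs": ["AiMemory"], "outputs": ["Main"]}
}
""")

def required_properties(node_def):
    """Return the names of properties marked required in a node definition."""
    return [p["name"] for p in node_def.get("properties", [])
            if p.get("required")]

print(required_properties(NODE_DEF))  # ['message']
print(NODE_DEF["io"]["inputs"])       # ['AiMemory']
```

Note that one property ("Wait for User Reply") carries only a `displayName` and no `name` key, so any code walking the list should use `.get()` rather than assuming every key is present.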

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ifmelate/n8n-workflow-builder-mcp'
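The same request can be issued from code. The sketch below only builds the URL from the `/servers/{owner}/{name}` pattern inferred from the single example above; other routes of the directory API are not documented here, and `server_url` is a hypothetical helper name:

```python
# Sketch: construct a Glama MCP directory API URL for a given server.
# The /servers/{owner}/{name} path pattern is inferred from the curl
# example above and may not cover other API routes.
BASE = "https://glama.ai/api/mcp/v1"

def server_url(owner: str, name: str) -> str:
    """Build the directory API URL for one MCP server."""
    return f"{BASE}/servers/{owner}/{name}"

print(server_url("ifmelate", "n8n-workflow-builder-mcp"))
# https://glama.ai/api/mcp/v1/servers/ifmelate/n8n-workflow-builder-mcp
```

The resulting URL can then be fetched with any HTTP client (curl, `urllib.request`, `requests`) as shown in the curl example.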

If you have feedback or need assistance with the MCP directory API, please join our Discord server.