
n8n-workflow-builder-mcp

by ifmelate
webhook.json (3.17 kB)
{
  "nodeType": "n8n-nodes-base.webhook",
  "displayName": "Webhook",
  "description": "Starts the workflow when a webhook is called",
  "version": [1, 1.1, 2, 2.1],
  "properties": [
    {
      "name": "multipleMethods",
      "displayName": "Allow Multiple HTTP Methods",
      "type": "boolean",
      "default": false,
      "description": "Whether to allow the webhook to listen for multiple HTTP methods"
    },
    {
      "name": "httpMethod",
      "displayName": "HTTP Methods",
      "type": "multiOptions",
      "default": ["GET", "POST"],
      "description": "The HTTP methods to listen to",
      "options": [
        { "name": "DELETE", "value": "DELETE" },
        { "name": "GET", "value": "GET" },
        { "name": "HEAD", "value": "HEAD" },
        { "name": "PATCH", "value": "PATCH" },
        { "name": "POST", "value": "POST" },
        { "name": "PUT", "value": "PUT" }
      ],
      "displayOptions": { "show": { "multipleMethods": [true] } }
    },
    {
      "name": "path",
      "displayName": "Path",
      "type": "string",
      "default": "",
      "description": "The path to listen to, dynamic values could be specified by using ':', e.g. 'your-path/:dynamic-value'. If dynamic values are set 'webhookId' would be prepended to path.",
      "placeholder": "webhook"
    },
    {
      "name": "webhookNotice",
      "displayName": "Insert a 'Respond to Webhook' node to control when and how you respond. <a href=\"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-base.respondtowebhook/\" target=\"_blank\">More details</a>",
      "type": "notice",
      "default": "",
      "displayOptions": { "show": { "responseMode": ["responseNode"] } }
    },
    {
      "name": "webhookStreamingNotice",
      "displayName": "Insert a node that supports streaming (e.g. 'AI Agent') and enable streaming to stream directly to the response while the workflow is executed. <a href=\"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-base.respondtowebhook/\" target=\"_blank\">More details</a>",
      "type": "notice",
      "default": "",
      "displayOptions": { "show": { "responseMode": ["streaming"] } }
    },
    {
      "name": "contentTypeNotice",
      "displayName": "If you are sending back a response, add a \"Content-Type\" response header with the appropriate value to avoid unexpected behavior",
      "type": "notice",
      "default": "",
      "displayOptions": { "show": { "responseMode": ["onReceived"] } }
    }
  ],
  "credentialsConfig": [],
  "io": {
    "inputs": [],
    "outputs": [],
    "outputNames": [],
    "hints": {}
  },
  "wiring": {
    "role": "generic",
    "requires": [],
    "optional": [],
    "consumedBy": [],
    "consumes": [],
    "produces": []
  }
}
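Several of the properties above use `displayOptions.show` to appear only when another parameter has a given value (for example, `httpMethod` is shown only when `multipleMethods` is true). A minimal sketch of that visibility rule in Python follows; `is_visible` is a hypothetical helper written for illustration, not part of n8n's API, and the property list is a trimmed copy of the definition above.

```python
# Sketch: how n8n-style "displayOptions.show" rules decide which
# properties are visible for the current parameter values.
properties = [
    {"name": "httpMethod",
     "displayOptions": {"show": {"multipleMethods": [True]}}},
    {"name": "webhookNotice",
     "displayOptions": {"show": {"responseMode": ["responseNode"]}}},
    {"name": "contentTypeNotice",
     "displayOptions": {"show": {"responseMode": ["onReceived"]}}},
    {"name": "path"},  # no displayOptions -> always shown
]

def is_visible(prop, values):
    """A property is shown only if every 'show' key matches a current value."""
    show = prop.get("displayOptions", {}).get("show", {})
    return all(values.get(key) in allowed for key, allowed in show.items())

values = {"multipleMethods": True, "responseMode": "onReceived"}
visible = [p["name"] for p in properties if is_visible(p, values)]
print(visible)  # ['httpMethod', 'contentTypeNotice', 'path']
```

With `responseMode` set to `"onReceived"`, only the matching `contentTypeNotice` is shown, while `webhookNotice` (which requires `"responseNode"`) stays hidden.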

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ifmelate/n8n-workflow-builder-mcp'
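Judging from the example above, the endpoint path appears to follow a `/servers/{owner}/{repo}` pattern. A small Python sketch of building such a URL for any server, assuming that pattern generalizes (it is inferred from this one example, not from a documented schema):

```python
# Build a Glama MCP directory URL for a given server.
# The /servers/{owner}/{repo} pattern is an assumption inferred from
# the curl example above.
BASE = "https://glama.ai/api/mcp/v1"

def server_url(owner: str, repo: str) -> str:
    return f"{BASE}/servers/{owner}/{repo}"

print(server_url("ifmelate", "n8n-workflow-builder-mcp"))
# -> https://glama.ai/api/mcp/v1/servers/ifmelate/n8n-workflow-builder-mcp
```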

If you have feedback or need assistance with the MCP directory API, please join our Discord server.