Glama

n8n-workflow-builder-mcp

by ifmelate
readWriteFile.json (1.39 kB)
{
  "nodeType": "n8n-nodes-base.readWriteFile",
  "displayName": "Read/Write Files from Disk",
  "description": "Read or write files from the computer that runs n8n",
  "version": 1,
  "properties": [
    {
      "name": "info",
      "displayName": "Use this node to read and write files on the same computer running n8n. To handle files between different computers please use other nodes (e.g. FTP, HTTP Request, AWS).",
      "type": "notice",
      "default": ""
    },
    {
      "name": "operation",
      "displayName": "Operation",
      "type": "options",
      "default": "read",
      "description": "Retrieve one or more files from the computer that runs n8n",
      "options": [
        {
          "name": "Read File(s) From Disk",
          "value": "read",
          "description": "Retrieve one or more files from the computer that runs n8n"
        },
        {
          "name": "Write File to Disk",
          "value": "write",
          "description": "Create a binary file on the computer that runs n8n"
        }
      ]
    }
  ],
  "credentialsConfig": [],
  "io": {
    "inputs": ["Main"],
    "outputs": ["Main"],
    "outputNames": [],
    "hints": {}
  },
  "wiring": {
    "role": "generic",
    "requires": [],
    "optional": [],
    "consumedBy": [],
    "consumes": ["Main"],
    "produces": ["Main"]
  }
}
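As a minimal sketch of how a client might consume this node definition, the snippet below loads the JSON and lists the available `operation` values. It inlines a reduced copy of the definition so it is self-contained; in practice you would read `readWriteFile.json` from disk.

```python
import json

# Reduced inline copy of the node definition above (assumption: in real
# use this would come from the readWriteFile.json file itself).
node_json = """
{
  "nodeType": "n8n-nodes-base.readWriteFile",
  "properties": [
    {
      "name": "operation",
      "type": "options",
      "default": "read",
      "options": [
        { "name": "Read File(s) From Disk", "value": "read" },
        { "name": "Write File to Disk", "value": "write" }
      ]
    }
  ]
}
"""

node = json.loads(node_json)

# Find the "operation" property and collect its option values.
operation = next(p for p in node["properties"] if p["name"] == "operation")
values = [opt["value"] for opt in operation["options"]]

print(node["nodeType"])  # → n8n-nodes-base.readWriteFile
print(values)            # → ['read', 'write']
```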

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ifmelate/n8n-workflow-builder-mcp'
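The same endpoint can be called from code. The sketch below assumes only the URL shape shown in the curl example above (owner followed by server name); the helper name `server_url` is illustrative, not part of the API.

```python
import urllib.request

# Base endpoint taken from the curl example above.
BASE = "https://glama.ai/api/mcp/v1/servers"

def server_url(owner: str, repo: str) -> str:
    """Build the directory API URL for a given MCP server (hypothetical helper)."""
    return f"{BASE}/{owner}/{repo}"

url = server_url("ifmelate", "n8n-workflow-builder-mcp")
# Uncomment to fetch the server's JSON metadata over HTTP:
# with urllib.request.urlopen(url) as resp:
#     metadata = resp.read()
print(url)  # → https://glama.ai/api/mcp/v1/servers/ifmelate/n8n-workflow-builder-mcp
```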

If you have feedback or need assistance with the MCP directory API, please join our Discord server.