
execute

Execute or test an n8n workflow by workflow ID or file path, optionally with input data; a reliable alternative to running n8n bash commands manually.

Instructions

Execute/test an n8n workflow - DO NOT use bash n8n commands, use this tool instead

Input Schema

| Name | Required | Description                      | Default |
|------|----------|----------------------------------|---------|
| data | No       | Input data for the workflow      |         |
| file | No       | Path to workflow file to execute |         |
| id   | No       | Workflow ID to execute           |         |

Input Schema (JSON Schema)

```json
{
  "type": "object",
  "properties": {
    "data": {
      "description": "Input data for the workflow",
      "type": "object"
    },
    "file": {
      "description": "Path to workflow file to execute",
      "type": "string"
    },
    "id": {
      "description": "Workflow ID to execute",
      "type": "string"
    }
  }
}
```
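Note that the schema marks all three properties optional, but the implementation requires at least one of `id` or `file` at runtime. A minimal sketch of that constraint (the `ExecuteArgs` type and `validateExecuteArgs` helper are illustrative, not part of the server):

```typescript
// Illustrative types and helper; not part of the McFlow server itself.
interface ExecuteArgs {
  id?: string;   // Workflow ID to execute
  file?: string; // Path to workflow file to execute
  data?: object; // Input data for the workflow
}

// Mirrors the runtime check in executeWorkflow(): at least one of
// id or file must be supplied, even though the JSON Schema marks
// every property as optional.
function validateExecuteArgs(args: ExecuteArgs): string | null {
  if (!args.id && !args.file) {
    return 'Either id or file must be specified';
  }
  return null; // valid
}
```

For example, `validateExecuteArgs({ id: 'abc123' })` returns `null` (valid), while `validateExecuteArgs({})` returns the error message.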

Implementation Reference

  • Registration of the 'execute' tool, including name, description, and input schema definition, in getToolDefinitions():

    ```typescript
    {
      name: 'execute',
      description: 'Execute/test an n8n workflow - DO NOT use bash n8n commands, use this tool instead',
      inputSchema: {
        type: 'object',
        properties: {
          id: { type: 'string', description: 'Workflow ID to execute' },
          file: { type: 'string', description: 'Path to workflow file to execute' },
          data: { type: 'object', description: 'Input data for the workflow' },
        },
      },
    },
    ```
  • Dispatch handler in ToolHandler.handleTool() that routes 'execute' tool calls to N8nManager.executeWorkflow:

    ```typescript
    case 'execute':
      return await this.n8nManager.executeWorkflow({
        id: args?.id as string,
        file: args?.file as string,
        data: args?.data as any,
      });
    ```
  • Core implementation of the execute tool in N8nManager.executeWorkflow(): builds and runs an 'n8n execute' CLI command with the workflow ID or file and optional input data, then handles output and errors. (The original cleanup step recomputed `Date.now()`, so it never deleted the temp file it had written; the filename is stored in a variable below so the same path is unlinked.)

    ```typescript
    async executeWorkflow(options: {
      id?: string;
      file?: string;
      data?: any;
    } = {}): Promise<any> {
      try {
        // Build the CLI command
        let command = 'n8n execute';
        if (options.id) {
          command += ` --id=${options.id}`;
        } else if (options.file) {
          const fullPath = path.join(this.workflowsPath, options.file);
          command += ` --file="${fullPath}"`;
        } else {
          throw new Error('Either id or file must be specified');
        }

        // Write input data to a temp file if provided; keep the path so
        // the same file can be cleaned up after execution.
        let dataFile: string | undefined;
        if (options.data) {
          dataFile = `/tmp/n8n-input-${Date.now()}.json`;
          await fs.writeFile(dataFile, JSON.stringify(options.data));
          command += ` --input="${dataFile}"`;
        }

        console.error(`Executing: ${command}`);

        const { stdout, stderr } = await execAsync(command, {
          timeout: 60000, // 60 second timeout
        });

        // Clean up the temp file if one was created
        if (dataFile) {
          await fs.unlink(dataFile).catch(() => {});
        }

        if (this.hasRealError(stderr, stdout)) {
          throw new Error(stderr);
        }

        // Parse execution results if possible
        let result: any = stdout;
        try {
          result = JSON.parse(stdout);
        } catch {
          // Not JSON, use as-is
        }

        return {
          content: [
            {
              type: 'text',
              text:
                `βœ… Workflow executed successfully!\n\n` +
                `${options.id ? `πŸ†” Workflow ID: ${options.id}\n` : ''}` +
                `${options.file ? `πŸ“ File: ${options.file}\n` : ''}` +
                `\nπŸ“Š Results:\n${typeof result === 'object' ? JSON.stringify(result, null, 2) : result}`,
            },
          ],
        };
      } catch (error: any) {
        throw new Error(`Failed to execute workflow: ${error.message}`);
      }
    }
    ```
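The command-construction step above can be isolated as a pure helper, which makes the branching easy to test without shelling out. This is a sketch under the same assumptions as the implementation; the `buildExecuteCommand` name is hypothetical, not part of the server:

```typescript
import * as path from 'path';

// Hypothetical helper mirroring the command-building logic in
// N8nManager.executeWorkflow(); not part of the actual server code.
function buildExecuteCommand(
  workflowsPath: string,
  options: { id?: string; file?: string; dataFile?: string },
): string {
  let command = 'n8n execute';
  if (options.id) {
    command += ` --id=${options.id}`;
  } else if (options.file) {
    // File paths are resolved relative to the configured workflows directory
    const fullPath = path.join(workflowsPath, options.file);
    command += ` --file="${fullPath}"`;
  } else {
    throw new Error('Either id or file must be specified');
  }
  // Input data, if any, is passed via a JSON temp file
  if (options.dataFile) {
    command += ` --input="${options.dataFile}"`;
  }
  return command;
}
```

For example, `buildExecuteCommand('/workflows', { id: 'abc123' })` yields `n8n execute --id=abc123`, and supplying `file: 'wf.json'` instead yields `n8n execute --file="/workflows/wf.json"`.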
