run_once

Manually trigger a single execution of an n8n workflow to test or run it on demand, returning detailed results.

Instructions

Execute a workflow manually once and return execution details

Input Schema

Name         Required  Description                                              Default
workflowId   Yes       ID of the workflow to execute (string or number)         —
input        No        Optional object forwarded to the execution as run data   —
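
For example, a client could invoke the tool as shown below. This is a minimal sketch using the MCP TypeScript SDK; the server command (node build/index.js), workflow ID, and input payload are illustrative assumptions, not values taken from this repository.

    import { Client } from '@modelcontextprotocol/sdk/client/index.js';
    import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

    // Spawn the n8n MCP server over stdio (command and args are assumptions for illustration).
    const transport = new StdioClientTransport({ command: 'node', args: ['build/index.js'] });
    const client = new Client({ name: 'example-client', version: '1.0.0' }, { capabilities: {} });
    await client.connect(transport);

    // Trigger a single manual execution of workflow 42; `input` is optional run data.
    const result = await client.callTool({
      name: 'run_once',
      arguments: {
        workflowId: 42,
        input: { customerEmail: 'jane@example.com' },
      },
    });

    // The handler returns one text content item containing JSON-encoded execution details.
    console.log(result.content);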

Implementation Reference

  • src/index.ts:297-298 (registration)
    Dispatches incoming 'run_once' tool calls to the handleRunOnce method within the MCP server's CallToolRequest handler.
    case 'run_once': return await this.handleRunOnce(request.params.arguments as { workflowId: string | number; input?: any });
  • Defines the tool schema including name, description, and input schema (workflowId required, input optional) for the list_tools response.
    {
      name: 'run_once',
      description: 'Execute a workflow manually once and return execution details',
      inputSchema: {
        type: 'object',
        properties: {
          workflowId: { oneOf: [{ type: 'string' }, { type: 'number' }] },
          input: { type: 'object' },
        },
        required: ['workflowId'],
      },
    },
  • Main MCP tool handler: resolves the workflow ID alias, delegates to N8nClient.runOnce, and formats the success response (a sketch of the resulting payload follows this list).
    private async handleRunOnce(args: { workflowId: string | number; input?: any }) {
      const workflowId = this.resolveWorkflowId(args.workflowId);
      const execution = await this.n8nClient.runOnce(workflowId, args.input);
      return { content: [{ type: 'text', text: JSON.stringify(jsonSuccess(execution), null, 2) }] };
    }
  • N8nClient helper: fetches workflow, detects trigger nodes, chooses /executions or /workflows/{id}/execute API endpoint, returns execution details.
    async runOnce(workflowId: string | number, input?: any): Promise<N8nExecutionResponse> {
      try {
        const workflow = await this.getWorkflow(workflowId);
        const hasTriggerNodes = workflow.nodes.some(
          (node) =>
            node.type === 'n8n-nodes-base.webhook' ||
            node.type === 'n8n-nodes-base.cron' ||
            node.type.includes('trigger'),
        );
        if (hasTriggerNodes) {
          const executionData = { workflowData: workflow, runData: input || {} };
          const response = await this.api.post<N8nApiResponse<any>>('/executions', executionData);
          return {
            executionId: response.data.data.id || response.data.data.executionId,
            status: response.data.data.status || 'running',
          };
        } else {
          const response = await this.api.post<N8nApiResponse<any>>(`/workflows/${workflowId}/execute`, { data: input || {} });
          return {
            executionId: response.data.data.id || response.data.data.executionId,
            status: response.data.data.status || 'running',
          };
        }
      } catch (error: any) {
        if (error instanceof Error && error.message.includes('404')) {
          throw new Error(`Workflow ${workflowId} not found or cannot be executed manually`);
        }
        throw error;
      }
    }
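
For orientation, the JSON text returned to the MCP client by handleRunOnce would look roughly like the sketch below. It assumes jsonSuccess wraps the execution details in a { success, data } envelope; that envelope and the sample values are assumptions, and only executionId and status are guaranteed by N8nExecutionResponse as shown above.

    // Sketch of the text payload produced by JSON.stringify(jsonSuccess(execution)).
    interface RunOnceResult {
      success: boolean;               // assumed flag added by the jsonSuccess helper
      data: {
        executionId: string | number; // from response.data.data.id or .executionId
        status: string;               // falls back to 'running' when the n8n API omits it
      };
    }

    // Illustrative example only; the execution ID is made up.
    const example: RunOnceResult = {
      success: true,
      data: { executionId: '1057', status: 'running' },
    };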

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/get2knowio/n8n-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.