
Add Dataflow Activity to Pipeline

add_dataflow_activity_to_pipeline

Add a Dataflow Activity to an existing Microsoft Fabric pipeline to extend data processing capabilities and build complex workflows incrementally.

Instructions

Add a Dataflow Activity to an existing Fabric pipeline.

Retrieves an existing pipeline, adds a Dataflow Activity to it, and updates the pipeline definition. The Dataflow Activity will be appended to any existing activities in the pipeline.

Use this tool when:

  • You have an existing pipeline and want to add a new Dataflow Activity

  • You're building complex pipelines with multiple activities

  • You want to incrementally build a pipeline

Parameters:

  • workspace_name: The display name of the workspace containing the pipeline.

  • pipeline_name: Name of the existing pipeline to update.

  • dataflow_name: Name of the Dataflow to run.

  • dataflow_workspace_name: Optional name of the workspace containing the Dataflow.

  • activity_name: Optional custom name for the activity (default: auto-generated).

  • depends_on_activity_name: Optional name of an existing activity this one depends on.

  • timeout: Activity timeout (default: "0.12:00:00").

  • retry: Number of retry attempts (default: 0).

  • retry_interval_seconds: Retry interval in seconds (default: 30).

Returns: Dictionary with status, pipeline_id, pipeline_name, activity_name, workspace_name, and message.
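A minimal sketch of an argument payload for this tool, built only from the parameters and defaults documented above; the workspace, pipeline, and activity names are made-up placeholders, and the result values are illustrative, not actual tool output.

```python
# Hypothetical argument payload for add_dataflow_activity_to_pipeline.
# "Analytics", "DailyLoad", "CleanOrders", etc. are placeholders.
arguments = {
    "workspace_name": "Analytics",          # required
    "pipeline_name": "DailyLoad",           # required
    "dataflow_name": "CleanOrders",         # required
    "activity_name": "RunCleanOrders",      # optional; auto-generated if omitted
    "depends_on_activity_name": "CopyRaw",  # optional
    "timeout": "0.12:00:00",                # default
    "retry": 0,                             # default
    "retry_interval_seconds": 30,           # default
}

REQUIRED = {"workspace_name", "pipeline_name", "dataflow_name"}

def missing_required(args: dict) -> list[str]:
    """Return the required parameter names absent from args, sorted."""
    return sorted(REQUIRED - args.keys())

# Shape of the return dictionary, per the field list documented above
# (values are illustrative only):
example_result = {
    "status": "Succeeded",
    "pipeline_id": "<guid>",
    "pipeline_name": "DailyLoad",
    "activity_name": "RunCleanOrders",
    "workspace_name": "Analytics",
    "message": "Dataflow Activity added.",
}
```

Checking for the three required parameters before calling avoids a round trip that would fail on validation.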

Input Schema

Name                      Required  Default
workspace_name            Yes
pipeline_name             Yes
dataflow_name             Yes
dataflow_workspace_name   No
activity_name             No
depends_on_activity_name  No
timeout                   No        0.12:00:00
retry                     No
retry_interval_seconds    No

Output Schema


No arguments

Behavior: 4/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

Since no annotations are provided, the description carries the full burden of behavioral disclosure. It effectively describes the tool's behavior: retrieving an existing pipeline, adding a Dataflow Activity, updating the pipeline definition, and appending to existing activities. It also mentions default values for parameters like timeout and retry settings. However, it doesn't cover potential side effects, error conditions, or authentication requirements.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is well-structured with clear sections: purpose statement, behavioral explanation, usage guidelines, parameter details, and return values. Every sentence adds value: the first paragraph explains what the tool does, the bullet points provide usage context, and the parameter section adds crucial information missing from the schema.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 5/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Given the complexity of a 9-parameter mutation tool with no annotations and 0% schema description coverage, the description provides comprehensive coverage. It explains the tool's purpose, when to use it, detailed parameter semantics, and includes return value information. With an output schema present, the description appropriately focuses on the tool's behavior and inputs rather than output details.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 5/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

With 0% schema description coverage, the description fully compensates by providing detailed explanations for all 9 parameters. It clarifies optional vs. required parameters, explains what each parameter represents (e.g., 'display name of the workspace,' 'name of the existing pipeline'), and provides default values for optional parameters like timeout and retry settings.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the specific action ('Add a Dataflow Activity to an existing Fabric pipeline') and distinguishes it from sibling tools like 'add_copy_activity_to_pipeline' and 'add_notebook_activity_to_pipeline' by specifying it's for Dataflow Activities. It explicitly mentions the resource ('existing Fabric pipeline') and the verb ('add').

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 5/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description includes an explicit 'Use this tool when:' section with three bullet points that clearly define the appropriate contexts for using this tool, such as when you have an existing pipeline, are building complex pipelines, or want incremental pipeline building. This provides clear guidance on when to select this tool over alternatives.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
