add_copy_activity_to_pipeline
Add a Copy Activity to an existing Microsoft Fabric pipeline to transfer data from a source database to a Lakehouse destination.
Instructions
Add a Copy Activity to an existing Fabric pipeline.
The tool retrieves the existing pipeline, adds a Copy Activity to it, and updates the pipeline definition. The Copy Activity is appended after any activities already in the pipeline.
Use this tool when:
- You have an existing pipeline and want to add a new Copy Activity
- You're building complex pipelines with multiple data copy operations
- You want to build a pipeline incrementally
Parameters:
- workspace_name: Display name of the workspace containing the pipeline.
- pipeline_name: Name of the existing pipeline to update.
- source_type: Type of source (e.g., "AzurePostgreSqlSource", "AzureSqlSource", "SqlServerSource").
- source_connection_id: Fabric workspace connection ID for the source database.
- source_table_schema: Schema name of the source table (e.g., "public", "dbo").
- source_table_name: Name of the source table (e.g., "movie").
- destination_lakehouse_id: Workspace artifact ID of the destination Lakehouse.
- destination_connection_id: Fabric workspace connection ID for the destination Lakehouse.
- destination_table_name: Name for the destination table in the Lakehouse.
- activity_name: Optional custom name for the activity (default: auto-generated).
- source_access_mode: Source access mode, "direct" or "sql" (default: "direct").
- source_sql_query: Optional SQL query, used with the "sql" access mode.
- table_action_option: Table action, "Append" or "Overwrite" (default: "Append").
- apply_v_order: Apply V-Order optimization (default: True).
- timeout: Activity timeout (default: "0.12:00:00").
- retry: Number of retry attempts (default: 0).
- retry_interval_seconds: Retry interval in seconds (default: 30).
Returns: Dictionary with status, pipeline_id, pipeline_name, activity_name, workspace_name, and message.
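For reference, a successful call returns a dictionary shaped roughly as below. Only the keys are documented above; the values shown are illustrative placeholders.

```python
# Illustrative shape of the return value; values are placeholders,
# only the keys come from the documentation above.
result = {
    "status": "Succeeded",
    "pipeline_id": "00000000-0000-0000-0000-000000000000",
    "pipeline_name": "DailyIngest",
    "activity_name": "CopyMovieToLakehouse",
    "workspace_name": "Analytics",
    "message": "Copy Activity added to pipeline.",
}
```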
Example:
```python
# First, get the lakehouse and connection IDs
lakehouses = list_items(workspace_name="Analytics", item_type="Lakehouse")
lakehouse_id = lakehouses["items"][0]["id"]
lakehouse_conn_id = "a216973e-47d7-4224-bb56-2c053bac6831"
```
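The example above stops after looking up the IDs. A minimal sketch of the follow-up call, built only from the parameters documented above; the pipeline name, source connection ID, and table names are assumed placeholders:

```python
# Then add the Copy Activity to an existing pipeline.
# "DailyIngest", the source connection ID, and the table names below are
# placeholder values for illustration.
result = add_copy_activity_to_pipeline(
    workspace_name="Analytics",
    pipeline_name="DailyIngest",
    source_type="AzurePostgreSqlSource",
    source_connection_id="<source-connection-id>",
    source_table_schema="public",
    source_table_name="movie",
    destination_lakehouse_id=lakehouse_id,
    destination_connection_id=lakehouse_conn_id,
    destination_table_name="movie",
)
print(result["status"], result["message"])
```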
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| workspace_name | Yes | Display name of the workspace containing the pipeline | |
| pipeline_name | Yes | Name of the existing pipeline to update | |
| source_type | Yes | Type of source (e.g., "AzurePostgreSqlSource", "AzureSqlSource", "SqlServerSource") | |
| source_connection_id | Yes | Fabric workspace connection ID for the source database | |
| source_table_schema | Yes | Schema name of the source table (e.g., "public", "dbo") | |
| source_table_name | Yes | Name of the source table (e.g., "movie") | |
| destination_lakehouse_id | Yes | Workspace artifact ID of the destination Lakehouse | |
| destination_connection_id | Yes | Fabric workspace connection ID for the destination Lakehouse | |
| destination_table_name | Yes | Name for the destination table in the Lakehouse | |
| activity_name | No | Optional custom name for the activity | auto-generated |
| source_access_mode | No | Source access mode ("direct" or "sql") | direct |
| source_sql_query | No | Optional SQL query for the "sql" access mode | |
| table_action_option | No | Table action ("Append" or "Overwrite") | Append |
| apply_v_order | No | Apply V-Order optimization | True |
| timeout | No | Activity timeout | 0.12:00:00 |
| retry | No | Number of retry attempts | 0 |
| retry_interval_seconds | No | Retry interval in seconds | 30 |
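When the source data needs to be shaped rather than copied table-for-table, the documented source_access_mode="sql" and source_sql_query parameters can be combined. A hedged sketch; all IDs, names, and the query are placeholders for illustration:

```python
# Sketch of the "sql" access mode: copy the result of a query instead of a
# whole table. Every ID, name, and the query below is a placeholder.
result = add_copy_activity_to_pipeline(
    workspace_name="Analytics",
    pipeline_name="DailyIngest",
    source_type="AzureSqlSource",
    source_connection_id="<source-connection-id>",
    source_table_schema="dbo",
    source_table_name="movie",
    source_access_mode="sql",
    source_sql_query="SELECT id, title, release_year FROM dbo.movie WHERE release_year >= 2020",
    destination_lakehouse_id="<lakehouse-artifact-id>",
    destination_connection_id="<lakehouse-connection-id>",
    destination_table_name="recent_movies",
    table_action_option="Overwrite",
)
```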