ListDIJobs
Retrieve and manage data integration task configurations for Alibaba Cloud DataWorks, filtering by project, source, destination, and migration type for streamlined workflow organization.
Instructions
Retrieve the list of data integration sync task configurations.
Input Schema
Name | Required | Description | Default |
---|---|---|---|
DestinationDataSourceType | No | Destination data source type. Leave empty for no restriction. Enum values: Hologres, OSS-HDFS, OSS, MaxCompute, LogHub, StarRocks, DataHub, AnalyticDB_For_MySQL, Kafka, Hive | |
MigrationType | No | Sync type. Valid enum values: FullAndRealtimeIncremental (full + real-time incremental), RealtimeIncremental (real-time incremental), Full (full), OfflineIncremental (offline incremental), FullAndOfflineIncremental (full + offline incremental) | |
Name | No | Name of the export task. The name must be unique: no export task with a duplicate name may exist in the current DataWorks workspace | |
PageNumber | No | Page number, starting from 1 | 1 |
PageSize | No | Number of entries per page; maximum is 100 | 10 |
ProjectId | No | ID of the workspace | |
SourceDataSourceType | No | Source data source type. Leave empty for no restriction. Enum values: PolarDB, MySQL, Kafka, LogHub, Hologres, Oracle, OceanBase, MongoDB, RedShift, Hive, SQLServer, Doris, ClickHouse | |
Input Schema (JSON Schema)
{
"$schema": "http://json-schema.org/draft-07/schema#",
"additionalProperties": false,
"properties": {
"DestinationDataSourceType": {
"description": "目标端数据源类型,不填代表不限制,枚举值:Hologres,OSS-HDFS,OSS,MaxCompute,LogHub,StarRocks,DataHub,AnalyticDB_For_MySQL,Kafka,Hive",
"type": "string"
},
"MigrationType": {
"description": "同步类型,可选的枚举值有:- FullAndRealtimeIncremental(全量和实时增量)- RealtimeIncremental(实时增量)- Full(全量)- OfflineIncremental(离线增量)- FullAndOfflineIncremental(全量+离线增量)",
"type": "string"
},
"Name": {
"description": "导出任务的名称。名称必须唯一,即当前DataWorks工作空间中不能存在名称重复的导出任务",
"type": "string"
},
"PageNumber": {
"description": "页码,从1开始,默认值为1"
},
"PageSize": {
"description": "每页显示的数据条数,默认为10条,最大为100条"
},
"ProjectId": {
"description": "工作空间的ID"
},
"SourceDataSourceType": {
"description": "源端数据源类型,不填代表不限制,枚举值:PolarDB,MySQL,Kafka,LogHub,Hologres,Oracle,OceanBase,MongoDB,RedShift,Hive,SQLServer,Doris,ClickHouse",
"type": "string"
}
},
"type": "object"
}
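Because PageSize caps each response at 100 entries, a client that wants every sync task in a workspace must walk pages until the reported total is reached. The sketch below illustrates that pagination pattern; `list_di_jobs` is a hypothetical stand-in for however your client invokes this tool (the stubbed data and the `TotalCount`/`DIJobs` response shape are assumptions for the sake of a runnable example).

```python
# Hypothetical stand-in for the real ListDIJobs call. It serves pages of job
# configs plus a total count, mirroring the PageNumber/PageSize contract above.
def list_di_jobs(ProjectId, PageNumber=1, PageSize=10, **filters):
    # Stubbed data so the sketch runs locally; a real client would issue the API call.
    all_jobs = [{"Name": f"job_{i}", "MigrationType": "Full"} for i in range(23)]
    start = (PageNumber - 1) * PageSize
    return {"TotalCount": len(all_jobs), "DIJobs": all_jobs[start:start + PageSize]}

def fetch_all_jobs(project_id, page_size=100, **filters):
    """Collect every job config by requesting pages until TotalCount is reached."""
    jobs, page = [], 1
    while True:
        resp = list_di_jobs(ProjectId=project_id, PageNumber=page,
                            PageSize=page_size, **filters)
        jobs.extend(resp["DIJobs"])
        # Stop once all entries are collected, or on an empty page as a safeguard.
        if len(jobs) >= resp["TotalCount"] or not resp["DIJobs"]:
            break
        page += 1
    return jobs

jobs = fetch_all_jobs(12345, page_size=10,
                      SourceDataSourceType="MySQL",
                      DestinationDataSourceType="Hologres")
print(len(jobs))  # 23
```

Optional filters such as `SourceDataSourceType` are simply passed through on every page request, so the whole result set is filtered consistently.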