# submit_job

Submit a job to a Dataproc cluster by specifying the project ID, region, cluster name, job type, and main file, plus optional arguments, JAR files, and properties. Supports Spark, PySpark, Hive, Pig, and Hadoop job types.
## Instructions
Submit a job to a Dataproc cluster.
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| args | No | Optional arguments to pass to the job | |
| cluster_name | Yes | Name of the Dataproc cluster to run the job on | |
| jar_files | No | Optional JAR files to include with the job | |
| job_type | Yes | Job type: Spark, PySpark, Hive, Pig, or Hadoop | |
| main_file | Yes | Main file for the job (e.g. a script or JAR) | |
| project_id | Yes | Google Cloud project ID | |
| properties | No | Optional job properties | |
| region | Yes | Region of the Dataproc cluster | |
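As a sketch of how the schema above maps to a tool call, the helper below assembles an argument payload, including required fields always and optional fields only when set. The function name `build_submit_job_args` and all values are hypothetical illustrations, not part of the tool itself.

```python
# Hypothetical helper: build the argument payload for a submit_job call.
# Field names mirror the input schema; values are placeholders.

def build_submit_job_args(project_id, region, cluster_name,
                          job_type, main_file,
                          args=None, jar_files=None, properties=None):
    """Assemble submit_job arguments, omitting unset optional fields."""
    payload = {
        "project_id": project_id,        # required
        "region": region,                # required
        "cluster_name": cluster_name,    # required
        "job_type": job_type,            # required: spark|pyspark|hive|pig|hadoop
        "main_file": main_file,          # required
    }
    # Optional fields are only included when the caller provides them.
    if args:
        payload["args"] = args
    if jar_files:
        payload["jar_files"] = jar_files
    if properties:
        payload["properties"] = properties
    return payload


# Example: a PySpark job with one argument and one Spark property.
example = build_submit_job_args(
    "my-project", "us-central1", "my-cluster",
    "pyspark", "gs://my-bucket/jobs/wordcount.py",
    args=["gs://my-bucket/input/"],
    properties={"spark.executor.memory": "4g"},
)
```

Omitting optional fields rather than sending empty lists keeps the payload minimal, so the tool's own defaults apply.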