Dataproc MCP Server

by warrenzhu25

submit_job

Submit a job to a Dataproc cluster by specifying the project ID, region, cluster name, job type, and main file, plus optional arguments, JAR files, and properties. Supports Spark, PySpark, Spark SQL, Hive, Pig, and Hadoop job types.

Instructions

Submit a job to a Dataproc cluster.

Args:
    project_id: Google Cloud project ID
    region: Dataproc region
    cluster_name: Target cluster name
    job_type: Type of job (spark, pyspark, spark_sql, hive, pig, hadoop)
    main_file: Main file/class for the job
    args: Job arguments
    jar_files: JAR files to include
    properties: Job properties
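
For illustration, a minimal sketch of how an MCP client might invoke this tool as a JSON-RPC 2.0 tools/call request. The project ID, region, cluster name, and Cloud Storage paths below are hypothetical placeholders, not values from this server's documentation:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "submit_job",
    "arguments": {
      "project_id": "my-project",
      "region": "us-central1",
      "cluster_name": "my-cluster",
      "job_type": "pyspark",
      "main_file": "gs://my-bucket/jobs/wordcount.py",
      "args": ["gs://my-bucket/input/", "gs://my-bucket/output/"]
    }
  }
}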

Input Schema

Name          Required  Description                                                  Default
args          No        Job arguments                                                null
cluster_name  Yes       Target cluster name
jar_files     No        JAR files to include                                         null
job_type      Yes       Type of job (spark, pyspark, spark_sql, hive, pig, hadoop)
main_file     Yes       Main file/class for the job
project_id    Yes       Google Cloud project ID
properties    No        Job properties                                               null
region        Yes       Dataproc region

Input Schema (JSON Schema)

{ "properties": { "args": { "default": null, "items": { "type": "string" }, "title": "Args", "type": "array" }, "cluster_name": { "title": "Cluster Name", "type": "string" }, "jar_files": { "default": null, "items": { "type": "string" }, "title": "Jar Files", "type": "array" }, "job_type": { "title": "Job Type", "type": "string" }, "main_file": { "title": "Main File", "type": "string" }, "project_id": { "title": "Project Id", "type": "string" }, "properties": { "additionalProperties": { "type": "string" }, "default": null, "title": "Properties", "type": "object" }, "region": { "title": "Region", "type": "string" } }, "required": [ "project_id", "region", "cluster_name", "job_type", "main_file" ], "title": "submit_jobArguments", "type": "object" }

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/warrenzhu25/dataproc-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.