
delete_batch_job

Delete a Dataproc serverless batch job from Google Cloud by specifying its project ID, region, and batch ID.

Instructions

Delete a batch job.

Args:
    project_id: Google Cloud project ID
    region: Dataproc region
    batch_id: Batch job ID to delete
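These three arguments together identify a single batch resource. As a rough illustration (not part of the server code — the helper name below is hypothetical), the fully qualified Dataproc batch name follows this pattern:

```python
def batch_resource_name(project_id: str, region: str, batch_id: str) -> str:
    """Build the fully qualified Dataproc batch resource name."""
    return f"projects/{project_id}/locations/{region}/batches/{batch_id}"

print(batch_resource_name("my-project", "us-central1", "batch-123"))
# projects/my-project/locations/us-central1/batches/batch-123
```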

Input Schema

Name        Required  Description              Default
project_id  Yes       Google Cloud project ID
region      Yes       Dataproc region
batch_id    Yes       Batch job ID to delete

Implementation Reference

  • MCP tool handler and registration for 'delete_batch_job'. Decorated with @mcp.tool(), delegates to DataprocBatchClient.delete_batch_job.
    @mcp.tool()
    async def delete_batch_job(project_id: str, region: str, batch_id: str) -> str:
        """Delete a batch job.

        Args:
            project_id: Google Cloud project ID
            region: Dataproc region
            batch_id: Batch job ID to delete
        """
        batch_client = DataprocBatchClient()
        try:
            result = await batch_client.delete_batch_job(project_id, region, batch_id)
            return str(result)
        except Exception as e:
            logger.error("Failed to delete batch job", error=str(e))
            return f"Error: {str(e)}"
  • Core implementation of delete_batch_job in DataprocBatchClient. Calls Google Cloud Dataproc API to delete the batch job.
    async def delete_batch_job(
        self, project_id: str, region: str, batch_id: str
    ) -> dict[str, Any]:
        """Delete a batch job."""
        try:
            loop = asyncio.get_event_loop()
            client = self._get_batch_client(region)
            request = types.DeleteBatchRequest(
                name=f"projects/{project_id}/locations/{region}/batches/{batch_id}"
            )
            await loop.run_in_executor(None, client.delete_batch, request)
            return {
                "batch_id": batch_id,
                "status": "DELETED",
                "message": f"Batch job {batch_id} deletion initiated",
            }
        except Exception as e:
            logger.error("Failed to delete batch job", error=str(e))
            raise
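Note the error contract in the tool handler: failures are caught and returned to the MCP client as an "Error: ..." string rather than propagated as an exception. A minimal, self-contained sketch of that pattern, using a stand-in client (`FailingBatchClient` is illustrative, not the real DataprocBatchClient):

```python
import asyncio


class FailingBatchClient:
    """Stand-in client whose delete call always fails (for illustration only)."""

    async def delete_batch_job(self, project_id: str, region: str, batch_id: str) -> dict:
        raise RuntimeError("permission denied")


async def delete_batch_job(project_id: str, region: str, batch_id: str) -> str:
    """Mirror the handler's try/except shape: errors become a string result."""
    batch_client = FailingBatchClient()
    try:
        result = await batch_client.delete_batch_job(project_id, region, batch_id)
        return str(result)
    except Exception as e:
        return f"Error: {str(e)}"


print(asyncio.run(delete_batch_job("my-project", "us-central1", "batch-123")))
# Error: permission denied
```

This keeps the tool's return type a plain string in both success and failure cases, which is convenient for MCP clients that render tool output as text.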
