
Gemini MCP Server

by mintmcqueen

batch_download_results

Download and parse completed batch job results from Gemini AI models. Retrieves JSONL files, extracts response data with metadata, and saves to local storage for analysis.

Instructions

DOWNLOAD BATCH RESULTS - Download and parse results from a completed batch job.

WORKFLOW:
1) Checks the job status (must be SUCCEEDED)
2) Downloads the result file from the Gemini API
3) Parses the JSONL results
4) Saves them to a local file
5) Returns the parsed results array

RETURNS: An array of results with the original keys, responses, and metadata. The same results are also saved to a file in outputLocation.
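To illustrate steps 3 through 5, here is a minimal sketch of how a downloaded JSONL result file could be parsed into that array. The field names (key, response) follow the Gemini batch output convention and the file path is a placeholder; neither is confirmed by this listing.

import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

// One JSON object per line; key/response follow the Gemini batch output
// convention and are an assumption about what this server writes.
interface BatchResultLine {
  key?: string;       // key supplied in the original batch request
  response?: unknown; // model response payload plus metadata
  [extra: string]: unknown;
}

async function readResults(path: string): Promise<BatchResultLine[]> {
  const results: BatchResultLine[] = [];
  const rl = createInterface({ input: createReadStream(path), crlfDelay: Infinity });
  for await (const line of rl) {
    if (line.trim().length === 0) continue; // skip blank lines
    results.push(JSON.parse(line) as BatchResultLine);
  }
  return results;
}

// Placeholder path; use the file reported by batch_download_results.
const parsed = await readResults("./batch-results/results.jsonl");
console.log(`Parsed ${parsed.length} result lines`);

Reading line by line keeps memory use flat even when a batch produces a large result file.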

Input Schema

Name            Required  Description                          Default
batchName       Yes       Batch job name/ID from batch_create
outputLocation  No        Directory to save the results file   Current working directory

Input Schema (JSON Schema)

{
  "properties": {
    "batchName": {
      "description": "Batch job name/ID from batch_create",
      "type": "string"
    },
    "outputLocation": {
      "description": "Directory to save results file (defaults to current working directory)",
      "type": "string"
    }
  },
  "required": ["batchName"],
  "type": "object"
}
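As a usage sketch, the arguments below match this schema, passed through the official MCP TypeScript SDK; the stdio launch command and the batch name are placeholders, not values taken from this listing.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio; the command is a placeholder for however
// your installation starts the Gemini MCP server.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "gemini-mcp"],
});

const client = new Client({ name: "batch-results-client", version: "1.0.0" });
await client.connect(transport);

// batchName comes from a prior batch_create call; outputLocation is optional.
const result = await client.callTool({
  name: "batch_download_results",
  arguments: {
    batchName: "batches/example-batch-id", // placeholder
    outputLocation: "./batch-results",
  },
});

console.log(result.content);
await client.close();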

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/mintmcqueen/gemini-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.