Glama

Gemini MCP Server

by mintmcqueen

batch_create_embeddings

Create batch embedding generation jobs for large-scale AI tasks at reduced cost. Process multiple text inputs simultaneously for semantic similarity, classification, clustering, and retrieval applications using Google's Gemini embedding model.

Instructions

CREATE EMBEDDINGS BATCH JOB - Create an async embeddings-generation batch job.

COST: 50% cheaper than the standard API.
MODEL: gemini-embedding-001 (1536 dimensions).
WORKFLOW:
1. Prepare content (use batch_ingest_embeddings for conversion)
2. Select a task type (use batch_query_task_type if unsure)
3. Upload the file
4. Call batch_create_embeddings
5. Monitor with batch_get_status
6. Download with batch_download_results
TASK TYPES: See batch_query_task_type for descriptions and recommendations.
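The exact per-line request format of the JSONL input file is not specified on this page (the server's batch_ingest_embeddings tool handles the conversion). As a rough sketch only, assuming each line wraps the text to embed and the chosen task type in a `request` object:

```python
import json

# Hypothetical JSONL rows for the batch input file. The field names
# ("request", "content", "taskType") are assumptions for illustration;
# use batch_ingest_embeddings to produce the real format.
texts = ["first document", "second document"]
lines = [
    json.dumps({"request": {"content": t, "taskType": "RETRIEVAL_DOCUMENT"}})
    for t in texts
]
jsonl = "\n".join(lines)  # one request per line, as JSONL requires
print(jsonl)
```

Each row is independent, which is what lets the batch backend process them in parallel at reduced cost.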

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| model | No | Embedding model | gemini-embedding-001 |
| requests | No | Inline embedding requests (for small batches) | |
| inputFileUri | No | URI of uploaded JSONL file with embedding requests | |
| taskType | Yes | Embedding task type (affects model optimization). Use batch_query_task_type for guidance. | |
| displayName | No | Optional display name for the batch job | |
| outputLocation | No | Output directory for results | |

Input Schema (JSON Schema)

{
  "properties": {
    "displayName": {
      "description": "Optional display name for the batch job",
      "type": "string"
    },
    "inputFileUri": {
      "description": "URI of uploaded JSONL file with embedding requests",
      "type": "string"
    },
    "model": {
      "default": "gemini-embedding-001",
      "description": "Embedding model",
      "enum": ["gemini-embedding-001"],
      "type": "string"
    },
    "outputLocation": {
      "description": "Output directory for results",
      "type": "string"
    },
    "requests": {
      "description": "Inline embedding requests (for small batches)",
      "type": "array"
    },
    "taskType": {
      "description": "Embedding task type (affects model optimization). Use batch_query_task_type for guidance.",
      "enum": [
        "SEMANTIC_SIMILARITY",
        "CLASSIFICATION",
        "CLUSTERING",
        "RETRIEVAL_DOCUMENT",
        "RETRIEVAL_QUERY",
        "CODE_RETRIEVAL_QUERY",
        "QUESTION_ANSWERING",
        "FACT_VERIFICATION"
      ],
      "type": "string"
    }
  },
  "required": ["taskType"],
  "type": "object"
}
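A minimal arguments payload that satisfies the schema above, sketched in Python. Only `taskType` is required; the `inputFileUri` and `displayName` values here are made-up placeholders:

```python
# Enum of allowed task types, copied from the schema above.
TASK_TYPES = {
    "SEMANTIC_SIMILARITY", "CLASSIFICATION", "CLUSTERING",
    "RETRIEVAL_DOCUMENT", "RETRIEVAL_QUERY", "CODE_RETRIEVAL_QUERY",
    "QUESTION_ANSWERING", "FACT_VERIFICATION",
}

# Example arguments for batch_create_embeddings; inputFileUri and
# displayName are placeholder values, not real resources.
args = {
    "model": "gemini-embedding-001",
    "taskType": "SEMANTIC_SIMILARITY",
    "inputFileUri": "files/example-batch-input",
    "displayName": "demo-embeddings-job",
}

# Mirror the schema's two constraints: taskType must be present
# and must be one of the enum values.
assert "taskType" in args and args["taskType"] in TASK_TYPES
```

Note that `requests` (inline) and `inputFileUri` (uploaded file) are alternative ways to supply the input; for large batches the workflow above uses the file-based path.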

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/mintmcqueen/gemini-mcp'
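The same request can be built from Python's standard library; this sketch only constructs the request object (no network call is made, and no authentication is shown):

```python
from urllib.request import Request

# Python equivalent of the curl command above, constructed but not sent.
url = "https://glama.ai/api/mcp/v1/servers/mintmcqueen/gemini-mcp"
req = Request(url, method="GET")
print(req.get_method(), req.full_url)
```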

If you have feedback or need assistance with the MCP directory API, please join our Discord server.