
upload_image_batch

Upload multiple images to Supabase Storage using file paths or base64 data, organizing them in specified buckets and folders with batch tracking.

Instructions

Upload multiple images to designated bucket and folder (supports both file paths and base64 data)

Input Schema

bucket_name (required): Target bucket name
batch_id (required): Unique batch identifier
folder_prefix (required): Folder organization (original/processed)
user_id (required): User identifier
image_paths (optional): Local file paths to upload (for local testing)
image_data (optional): Base64 encoded image data (for Claude Desktop compatibility)

Exactly one of image_paths or image_data must be provided.
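For reference, a tool call using the file-path variant might carry arguments like the following (all values are illustrative, not real buckets or users):

```typescript
// Illustrative arguments for the image_paths variant (example values only).
const args = {
  bucket_name: 'product-images',   // 3-63 characters
  batch_id: 'batch-2024-06-01-a',  // up to 64 characters
  folder_prefix: 'original',       // 'original' or 'processed'
  user_id: 'user-123',             // up to 36 characters
  image_paths: [
    '/tmp/photo-1.jpg',
    '/tmp/photo-2.png'
  ]
};
```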

Implementation Reference

  • Main handler function for the 'upload_image_batch' tool. Extracts arguments, validates input, prepares upload options, calls processBatchUpload from file-upload module, computes success rate, audits the request, and returns formatted response.
    async function handleUploadImageBatch(args: any, requestId: string, startTime: number) {
      const { bucket_name, batch_id, folder_prefix, user_id, image_paths, image_data } = args;

      // Validate input - must have either image_paths or image_data
      if (!image_paths && !image_data) {
        throw new Error('Either image_paths or image_data must be provided');
      }
      if (image_paths && image_data) {
        throw new Error('Cannot specify both image_paths and image_data - choose one');
      }

      const fileCount = image_paths ? image_paths.length : image_data.length;
      const inputHash = generateSecureHash(
        JSON.stringify({ bucket_name, batch_id, folder_prefix, user_id, fileCount })
      );

      try {
        const uploadOptions = {
          bucketName: bucket_name,
          batchId: batch_id,
          folderPrefix: folder_prefix,
          userId: user_id,
          supabase
        };

        let batchResult;
        if (image_paths) {
          // Use file paths (for local testing)
          batchResult = await processBatchUpload(image_paths, uploadOptions);
        } else {
          // Use base64 data (for Claude Desktop)
          batchResult = await processBatchUpload(image_data, uploadOptions);
        }

        const successRate = batchResult.total > 0
          ? `${Math.round((batchResult.success_count / batchResult.total) * 100)}%`
          : '0%';

        auditRequest('upload_image_batch', batchResult.success_count > 0, inputHash);

        const response = {
          success: true,
          batch_id: batch_id,
          summary: {
            total_files: batchResult.total,
            successful_uploads: batchResult.success_count,
            failed_uploads: batchResult.error_count,
            success_rate: successRate
          },
          results: {
            successful: batchResult.successful,
            failed: batchResult.failed,
            total: batchResult.total,
            success_count: batchResult.success_count,
            error_count: batchResult.error_count
          },
          request_id: requestId,
          processing_time: Date.now() - startTime
        };

        return {
          content: [
            {
              type: 'text',
              text: JSON.stringify(response, null, 2)
            }
          ]
        };
      } catch (error) {
        auditRequest('upload_image_batch', false, inputHash, getErrorMessage(error));
        throw error;
      }
    }
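The helpers generateSecureHash and auditRequest are imported elsewhere and not shown in this reference. A plausible sketch of the hash helper, assuming a SHA-256 hex digest over the serialized input:

```typescript
import { createHash } from 'node:crypto';

// Hypothetical sketch of generateSecureHash - the real implementation is not
// shown in the reference; a SHA-256 hex digest is one plausible choice.
function generateSecureHash(input: string): string {
  return createHash('sha256').update(input).digest('hex');
}
```

Note that the handler hashes only request metadata (bucket, batch, folder, user, and file count), never the image bytes themselves, which keeps the audit trail small regardless of payload size.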
  • Tool registration entry including name, description, and detailed input schema for 'upload_image_batch'. Defines validation rules for batch upload parameters supporting both local file paths and base64 image data.
    {
      name: 'upload_image_batch',
      description: 'Upload multiple images to designated bucket and folder (supports both file paths and base64 data)',
      inputSchema: {
        type: 'object',
        properties: {
          bucket_name: { type: 'string', description: 'Target bucket name', minLength: 3, maxLength: 63 },
          batch_id: { type: 'string', description: 'Unique batch identifier', maxLength: 64 },
          folder_prefix: { type: 'string', description: 'Folder organization (original/processed)', maxLength: 100 },
          user_id: { type: 'string', description: 'User identifier', maxLength: 36 },
          image_paths: {
            type: 'array',
            description: 'Local file paths to upload (for local testing)',
            items: { type: 'string', maxLength: 4096 },
            minItems: 1,
            maxItems: 500
          },
          image_data: {
            type: 'array',
            description: 'Base64 encoded image data (for Claude Desktop compatibility)',
            items: {
              type: 'object',
              properties: {
                filename: { type: 'string', description: 'Original filename with extension', maxLength: 255 },
                content: {
                  type: 'string',
                  description: 'Base64 encoded file content',
                  maxLength: 67108864 // 64 MiB of base64 text (~48 MiB decoded)
                },
                mime_type: {
                  type: 'string',
                  description: 'MIME type of the file',
                  enum: ['image/jpeg', 'image/png', 'image/webp', 'image/gif']
                }
              },
              required: ['filename', 'content', 'mime_type'],
              additionalProperties: false
            },
            minItems: 1,
            maxItems: 500
          }
        },
        required: ['bucket_name', 'batch_id', 'folder_prefix', 'user_id'],
        additionalProperties: false,
        oneOf: [
          { required: ['image_paths'] },
          { required: ['image_data'] }
        ]
      }
    },
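A tool call using the base64 variant would satisfy the image_data item schema above; for example (the content value is a placeholder, not a real image):

```typescript
// Illustrative arguments for the base64 image_data variant (example values only).
const args = {
  bucket_name: 'product-images',
  batch_id: 'batch-2024-06-01-b',
  folder_prefix: 'processed',
  user_id: 'user-123',
  image_data: [
    {
      filename: 'photo-1.jpg',
      content: Buffer.from('placeholder bytes').toString('base64'),
      mime_type: 'image/jpeg' // must be one of the four types in the enum
    }
  ]
};
```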
  • Core helper function implementing batch image upload logic. Validates and processes each file (paths or base64), generates secure storage paths, performs individual uploads via uploadSingleFile, tracks success/failure, audits the operation, and returns detailed batch results.
    export async function processBatchUpload(
      inputData: string[] | Base64ImageData[],
      options: UploadOptions
    ): Promise<BatchUploadResult> {
      const results: UploadResult[] = [];
      let successCount = 0;
      let errorCount = 0;

      // Security: Validate batch size
      validateBatchSize(inputData.length);

      // Determine if input is file paths or base64 data
      const isBase64Input = inputData.length > 0 && typeof inputData[0] === 'object';

      // Process each file
      for (let i = 0; i < inputData.length; i++) {
        const input = inputData[i];
        let fileInfo: FileInfo;
        let identifier: string = `batch_item_${i}`;

        try {
          if (isBase64Input) {
            // Handle base64 input
            const base64Data = input as Base64ImageData;
            fileInfo = await validateAndReadBase64File(base64Data);
            identifier = base64Data.filename;
          } else {
            // Handle file path input
            const filePath = input as string;
            fileInfo = await validateAndReadFile(filePath);
            identifier = filePath;
          }

          // Generate storage path
          const storagePath = generateStoragePath(
            options.folderPrefix,
            options.userId,
            options.batchId,
            fileInfo.filename
          );

          // Upload file
          const result = await uploadSingleFile(fileInfo, storagePath, options);
          results.push(result);

          if (result.success) {
            successCount++;
          } else {
            errorCount++;
          }
        } catch (error) {
          const result: UploadResult = {
            original_path: identifier || `batch_item_${i}`,
            storage_path: '',
            file_id: '',
            success: false,
            error: getErrorMessage(error)
          };
          results.push(result);
          errorCount++;
        }
      }

      // Audit the batch operation
      auditRequest('upload_image_batch', successCount > 0, generateSecureHash(JSON.stringify({
        batch_id: options.batchId,
        bucket_name: options.bucketName,
        total_files: inputData.length,
        success_count: successCount
      })));

      return {
        successful: results.filter(r => r.success),
        failed: results.filter(r => !r.success),
        total: inputData.length,
        success_count: successCount,
        error_count: errorCount,
        batch_id: options.batchId,
        security_summary: {
          validations_passed: successCount,
          validations_failed: errorCount,
          risk_score_average: 0 // Low risk for successful file uploads
        }
      };
    }
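generateStoragePath is called above but lives in the file-upload module and is not shown. A minimal sketch, assuming a folderPrefix/userId/batchId/filename layout (the layout and the sanitization step are assumptions, not confirmed by the source):

```typescript
// Hypothetical sketch of generateStoragePath. It strips directory components
// from the filename so a crafted name like '../evil.jpg' cannot escape the
// batch folder.
function generateStoragePath(
  folderPrefix: string,
  userId: string,
  batchId: string,
  filename: string
): string {
  const safeName = filename.split('/').pop()!.split('\\').pop()!;
  return `${folderPrefix}/${userId}/${batchId}/${safeName}`;
}
```

Scoping every object key by user and batch is what lets the tool report per-batch results and keeps uploads from different users isolated within the same bucket.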
  • Dispatch case in the main CallToolRequestSchema handler (src/index.ts:470-471) that routes 'upload_image_batch' calls to the handleUploadImageBatch function.
    case 'upload_image_batch':
      return await handleUploadImageBatch(args, requestId, startTime);
