Supabase Storage MCP

by Desmond-Labs

upload_image_batch

Upload multiple images to Supabase Storage using file paths or base64 data, organizing them in specified buckets and folders with batch tracking.

Instructions

Upload multiple images to designated bucket and folder (supports both file paths and base64 data)

Input Schema

| Name | Required | Description |
| --- | --- | --- |
| bucket_name | Yes | Target bucket name |
| batch_id | Yes | Unique batch identifier |
| folder_prefix | Yes | Folder organization (original/processed) |
| user_id | Yes | User identifier |
| image_paths | No | Local file paths to upload (for local testing) |
| image_data | No | Base64 encoded image data (for Claude Desktop compatibility) |
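
A hypothetical call to this tool (base64 variant) might look like the following sketch; all identifiers and the base64 payload are illustrative, not real values. It also mirrors the handler's rule that exactly one of `image_paths` or `image_data` must be supplied.

```typescript
// Illustrative arguments for upload_image_batch (base64 variant).
// Bucket, batch, user, and content values are made up for this example.
const exampleArgs = {
  bucket_name: "product-images",
  batch_id: "batch_2024_001",
  folder_prefix: "original",
  user_id: "user_123",
  image_data: [
    {
      filename: "photo.png",
      content: "iVBORw0KGgoAAAANSUhEUg", // truncated illustrative base64
      mime_type: "image/png",
    },
  ],
};

// The handler rejects calls that set both or neither input field:
const hasPaths = "image_paths" in exampleArgs;
const hasData = "image_data" in exampleArgs;
const isValid = hasPaths !== hasData; // exactly one must be present
```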

Implementation Reference

  • Main handler function for the 'upload_image_batch' tool. Extracts arguments, validates input, prepares upload options, calls processBatchUpload from the file-upload module, computes the success rate, audits the request, and returns a formatted response.
    async function handleUploadImageBatch(args: any, requestId: string, startTime: number) {
      const { bucket_name, batch_id, folder_prefix, user_id, image_paths, image_data } = args;
      
      // Validate input - must have either image_paths or image_data
      if (!image_paths && !image_data) {
        throw new Error('Either image_paths or image_data must be provided');
      }
      
      if (image_paths && image_data) {
        throw new Error('Cannot specify both image_paths and image_data - choose one');
      }
      
      const fileCount = image_paths ? image_paths.length : image_data.length;
      const inputHash = generateSecureHash(JSON.stringify({ bucket_name, batch_id, folder_prefix, user_id, fileCount }));
      
      try {
        const uploadOptions = {
          bucketName: bucket_name,
          batchId: batch_id,
          folderPrefix: folder_prefix,
          userId: user_id,
          supabase
        };
        
        let batchResult;
        
        if (image_paths) {
          // Use file paths (for local testing)
          batchResult = await processBatchUpload(image_paths, uploadOptions);
        } else {
          // Use base64 data (for Claude Desktop)
          batchResult = await processBatchUpload(image_data, uploadOptions);
        }
        
        const successRate = batchResult.total > 0 ? `${Math.round((batchResult.success_count / batchResult.total) * 100)}%` : '0%';
        
        auditRequest('upload_image_batch', batchResult.success_count > 0, inputHash);
        
        const response = {
          success: true,
          batch_id: batch_id,
          summary: {
            total_files: batchResult.total,
            successful_uploads: batchResult.success_count,
            failed_uploads: batchResult.error_count,
            success_rate: successRate
          },
          results: {
            successful: batchResult.successful,
            failed: batchResult.failed,
            total: batchResult.total,
            success_count: batchResult.success_count,
            error_count: batchResult.error_count
          },
          request_id: requestId,
          processing_time: Date.now() - startTime
        };
        
        return {
          content: [
            {
              type: 'text',
              text: JSON.stringify(response, null, 2)
            }
          ]
        };
      } catch (error) {
        auditRequest('upload_image_batch', false, inputHash, getErrorMessage(error));
        throw error;
      }
    }
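  • The handler above delegates path construction to a generateStoragePath helper that is not shown on this page. A plausible minimal sketch, assuming it simply composes a deterministic object key from the upload options and strips path components from the filename, could look like this (the real implementation may differ):

    ```typescript
    // Hypothetical sketch of generateStoragePath (actual source not shown here).
    // Builds "<folderPrefix>/<userId>/<batchId>/<filename>" and drops any
    // directory components from the filename to avoid path traversal.
    function generateStoragePath(
      folderPrefix: string,
      userId: string,
      batchId: string,
      filename: string
    ): string {
      const safeName = filename.split(/[/\\]/).pop() ?? filename;
      return `${folderPrefix}/${userId}/${batchId}/${safeName}`;
    }
    ```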
  • Tool registration entry including name, description, and detailed input schema for 'upload_image_batch'. Defines validation rules for batch upload parameters supporting both local file paths and base64 image data.
    {
      name: 'upload_image_batch',
      description: 'Upload multiple images to designated bucket and folder (supports both file paths and base64 data)',
      inputSchema: {
        type: 'object',
        properties: {
          bucket_name: {
            type: 'string',
            description: 'Target bucket name',
            minLength: 3,
            maxLength: 63
          },
          batch_id: {
            type: 'string',
            description: 'Unique batch identifier',
            maxLength: 64
          },
          folder_prefix: {
            type: 'string',
            description: 'Folder organization (original/processed)',
            maxLength: 100
          },
          user_id: {
            type: 'string',
            description: 'User identifier',
            maxLength: 36
          },
          image_paths: {
            type: 'array',
            description: 'Local file paths to upload (for local testing)',
            items: { type: 'string', maxLength: 4096 },
            minItems: 1,
            maxItems: 500
          },
          image_data: {
            type: 'array',
            description: 'Base64 encoded image data (for Claude Desktop compatibility)',
            items: {
              type: 'object',
              properties: {
                filename: {
                  type: 'string',
                  description: 'Original filename with extension',
                  maxLength: 255
                },
                content: {
                  type: 'string',
                  description: 'Base64 encoded file content',
                  maxLength: 67108864 // ~50MB base64 limit
                },
                mime_type: {
                  type: 'string',
                  description: 'MIME type of the file',
                  enum: ['image/jpeg', 'image/png', 'image/webp', 'image/gif']
                }
              },
              required: ['filename', 'content', 'mime_type'],
              additionalProperties: false
            },
            minItems: 1,
            maxItems: 500
          }
        },
        required: ['bucket_name', 'batch_id', 'folder_prefix', 'user_id'],
        additionalProperties: false,
        oneOf: [
          { required: ['image_paths'] },
          { required: ['image_data'] }
        ]
      }
    },
  • Core helper function implementing batch image upload logic. Validates and processes each file (paths or base64), generates secure storage paths, performs individual uploads via uploadSingleFile, tracks success/failure, audits the operation, and returns detailed batch results.
    export async function processBatchUpload(
      inputData: string[] | Base64ImageData[],
      options: UploadOptions
    ): Promise<BatchUploadResult> {
      const results: UploadResult[] = [];
      let successCount = 0;
      let errorCount = 0;
    
      // Security: Validate batch size
      validateBatchSize(inputData.length);
    
      // Determine if input is file paths or base64 data
      const isBase64Input = inputData.length > 0 && typeof inputData[0] === 'object';
    
      // Process each file
      for (let i = 0; i < inputData.length; i++) {
        const input = inputData[i];
        let fileInfo: FileInfo;
        let identifier: string = `batch_item_${i}`;
        
        try {
          if (isBase64Input) {
            // Handle base64 input
            const base64Data = input as Base64ImageData;
            fileInfo = await validateAndReadBase64File(base64Data);
            identifier = base64Data.filename;
          } else {
            // Handle file path input
            const filePath = input as string;
            fileInfo = await validateAndReadFile(filePath);
            identifier = filePath;
          }
          
          // Generate storage path
          const storagePath = generateStoragePath(
            options.folderPrefix,
            options.userId,
            options.batchId,
            fileInfo.filename
          );
    
          // Upload file
          const result = await uploadSingleFile(fileInfo, storagePath, options);
          results.push(result);
    
          if (result.success) {
            successCount++;
          } else {
            errorCount++;
          }
    
        } catch (error) {
          const result: UploadResult = {
            original_path: identifier || `batch_item_${i}`,
            storage_path: '',
            file_id: '',
            success: false,
            error: getErrorMessage(error)
          };
          results.push(result);
          errorCount++;
        }
      }
    
      // Audit the batch operation
      auditRequest('upload_image_batch', successCount > 0, generateSecureHash(JSON.stringify({
        batch_id: options.batchId,
        bucket_name: options.bucketName,
        total_files: inputData.length,
        success_count: successCount
      })));
    
      return {
        successful: results.filter(r => r.success),
        failed: results.filter(r => !r.success),
        total: inputData.length,
        success_count: successCount,
        error_count: errorCount,
        batch_id: options.batchId,
        security_summary: {
          validations_passed: successCount,
          validations_failed: errorCount,
          risk_score_average: 0 // Low risk for successful file uploads
        }
      };
    }
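  • processBatchUpload delegates the actual transfer to uploadSingleFile, which is not excerpted on this page. A minimal sketch, assuming it uses the standard supabase-js storage API (`storage.from(bucket).upload(path, body, options)`) and maps the result onto the UploadResult shape seen above, might read (the real helper likely adds more validation):

    ```typescript
    // Hypothetical sketch of uploadSingleFile (actual source not shown here).
    interface FileInfo {
      filename: string;
      data: Uint8Array;
      mimeType: string;
    }

    interface UploadResult {
      original_path: string;
      storage_path: string;
      file_id: string;
      success: boolean;
      error?: string;
    }

    async function uploadSingleFile(
      fileInfo: FileInfo,
      storagePath: string,
      options: { bucketName: string; supabase: any }
    ): Promise<UploadResult> {
      // supabase-js returns { data, error } rather than throwing on failure.
      const { data, error } = await options.supabase.storage
        .from(options.bucketName)
        .upload(storagePath, fileInfo.data, {
          contentType: fileInfo.mimeType,
          upsert: false, // never silently overwrite an existing object
        });

      if (error) {
        return {
          original_path: fileInfo.filename,
          storage_path: "",
          file_id: "",
          success: false,
          error: error.message,
        };
      }

      return {
        original_path: fileInfo.filename,
        storage_path: storagePath,
        file_id: data?.id ?? "",
        success: true,
      };
    }
    ```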
  • src/index.ts:470-471 (dispatch)
    Dispatch case in the main CallToolRequestSchema handler that routes 'upload_image_batch' calls to the handleUploadImageBatch function.
    case 'upload_image_batch':
      return await handleUploadImageBatch(args, requestId, startTime);
Behavior: 2/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

With no annotations provided, the description carries full burden for behavioral disclosure. While it mentions the upload action and supported formats, it lacks critical information: whether this is a mutating operation (implied but not stated), what permissions are required, whether there are rate limits or size constraints beyond the schema's maxItems, what happens on failure (partial uploads?), and what the response contains. For a batch upload tool with no annotation coverage, this is insufficient.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.
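
One way to close this gap would be a tool annotations block using the standard MCP annotation hints. The values below are a hedged sketch of what this tool's behavior appears to warrant, not an official part of the server:

```typescript
// Hypothetical MCP annotations for upload_image_batch, inferred from the
// handler code above; the server does not currently ship any annotations.
const uploadImageBatchAnnotations = {
  title: "Upload Image Batch",
  readOnlyHint: false,    // writes objects to Supabase Storage
  destructiveHint: false, // adds files; does not overwrite or delete
  idempotentHint: false,  // re-running re-uploads under the same batch
  openWorldHint: true,    // talks to an external Supabase project
};
```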

Conciseness: 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is a single, efficient sentence that communicates the core functionality without any wasted words. It's appropriately front-loaded with the main action and includes the key detail about dual input formats in a parenthetical. Every word earns its place.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 2/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

For a batch upload tool with 6 parameters, no annotations, and no output schema, the description is incomplete. It doesn't address critical behavioral aspects like mutation implications, error handling, response format, or usage context. The agent would need to infer too much about how this tool behaves in practice given its complexity and lack of structured metadata.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema description coverage is 100%, so the schema already documents all parameters thoroughly. The description adds minimal value by mentioning 'supports both file paths and base64 data' which corresponds to the image_paths and image_data parameters, but doesn't provide additional context beyond what's in the schema descriptions. This meets the baseline for high schema coverage.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 4/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the action ('upload multiple images') and target ('to designated bucket and folder'), with the specific detail about supporting both file paths and base64 data. However, it doesn't explicitly differentiate this batch upload tool from potential single-file upload siblings that might exist on the server (though none are listed among the siblings).

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 2/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description provides no guidance on when to use this tool versus alternatives like 'create_signed_urls' or 'list_files'. It mentions the two input formats (file paths vs base64) but doesn't explain when each format is appropriate (e.g., local testing vs remote scenarios). No prerequisites, exclusions, or alternative recommendations are provided.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
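
As an illustration of the kind of rewrite this critique calls for, a description string folding in mutation behavior, failure semantics, and format guidance might look like the following. The wording is purely a suggestion, not the server's actual text:

```typescript
// One possible rewritten description addressing the gaps noted above
// (illustrative only; limits are taken from the input schema on this page).
const improvedDescription =
  "Upload multiple images (1-500) to a Supabase Storage bucket under " +
  "folder_prefix/user_id/batch_id/. Mutating: creates storage objects; " +
  "failures are reported per file and successful uploads are not rolled " +
  "back. Use image_paths when the server can read local files; use " +
  "image_data (base64, ~50MB per file) from remote clients such as " +
  "Claude Desktop. Provide exactly one of image_paths or image_data.";
```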
