export

Read-only

Export large SQL query results or saved Metabase cards to CSV, JSON, or XLSX format. Automatically saves files to Downloads/Metabase folder, supporting up to 1 million rows of data.

Instructions

Unified command to export large SQL query results or saved cards using Metabase export endpoints (supports up to 1M rows). Returns data in specified format (CSV, JSON, or XLSX) and automatically saves to Downloads/Metabase folder.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| database_id | No | Database ID to export query from (SQL mode only) | |
| query | No | SQL query to execute and export (SQL mode only) | |
| card_id | No | ID of saved card to export (card mode only) | |
| native_parameters | No | Parameters for SQL template variables like {{variable_name}} (SQL mode only) | |
| card_parameters | No | Parameters for filtering card results before export (card mode only). Each parameter must follow the Metabase format: {id: "uuid", slug: "param_name", target: ["dimension", ["template-tag", "param_name"]], type: "param_type", value: "param_value"} | |
| format | No | Export format: csv (text), json (structured data), or xlsx (Excel file) | csv |
| filename | No | Custom filename (without extension) for the saved file. If not provided, a timestamp-based name will be used. | |
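A pair of hypothetical argument payloads can make the two modes concrete. Every ID, parameter value, and template tag below (`orders`, `start_date`, `status`, the UUID) is illustrative, not taken from the source:

```typescript
// SQL mode: database_id + query; native_parameters bind {{template}} tags.
// All IDs and values here are hypothetical.
const sqlModeArgs = {
  database_id: 2,
  query: 'SELECT * FROM orders WHERE created_at >= {{start_date}}',
  native_parameters: [
    {
      type: 'date/single',
      target: ['variable', ['template-tag', 'start_date']],
      value: '2024-01-01',
    },
  ],
  format: 'csv',
  filename: 'orders_january',
};

// Card mode: card_id plus optional card_parameters in the Metabase shape
// documented above (id / slug / target / type / value).
const cardModeArgs = {
  card_id: 42,
  card_parameters: [
    {
      id: 'a1b2c3d4-0000-0000-0000-000000000000', // hypothetical UUID
      slug: 'status',
      target: ['dimension', ['template-tag', 'status']],
      type: 'string/=',
      value: 'completed',
    },
  ],
  format: 'xlsx',
};
```

Note that the two payloads never mix: the handler rejects requests that carry both `database_id` and `card_id`.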

Implementation Reference

  • Main handler function for the 'export' tool. Validates parameters and dispatches to card export or SQL query export implementations.
    export async function handleExport(
      request: CallToolRequest,
      requestId: string,
      apiClient: MetabaseApiClient,
      logDebug: (message: string, data?: unknown) => void,
      logInfo: (message: string, data?: unknown) => void,
      logWarn: (message: string, data?: unknown, error?: Error) => void,
      logError: (message: string, error: unknown) => void
    ): Promise<ExportResponse> {
      const args = request.params?.arguments as ExportRequest;
    
      const databaseId = args?.database_id;
      const query = args?.query;
      const cardId = args?.card_id;
      const nativeParameters = Array.isArray(args?.native_parameters) ? args.native_parameters : [];
      const cardParameters = Array.isArray(args?.card_parameters) ? args.card_parameters : [];
      const format = validateEnumValue(
        args?.format || 'csv',
        ['csv', 'json', 'xlsx'] as const,
        'format',
        requestId,
        logWarn
      );
      const filename = args?.filename;
    
      // Validate that exactly one of card_id or database_id is provided
      // (explicit undefined checks, since an ID of 0 must not be treated as missing)
      if (cardId === undefined && databaseId === undefined) {
        logWarn('Missing required parameters: either card_id or database_id must be provided', {
          requestId,
        });
        throw new McpError(
          ErrorCode.InvalidParams,
          'Either card_id or database_id parameter is required'
        );
      }
    
      if (cardId !== undefined && databaseId !== undefined) {
        logWarn('Both card_id and database_id provided - only one is allowed', { requestId });
        throw new McpError(
          ErrorCode.InvalidParams,
          'Cannot specify both card_id and database_id - choose one export method'
        );
      }
    
      // Validate positive integer parameters
      if (cardId !== undefined) {
        validatePositiveInteger(cardId, 'card_id', requestId, logWarn);
      }
      if (databaseId !== undefined) {
        validatePositiveInteger(databaseId, 'database_id', requestId, logWarn);
      }
    
      // Strict parameter validation for card export mode
      if (cardId !== undefined) {
        // For card export, only card_id, card_parameters, format, and filename are allowed
        if (query || databaseId !== undefined || (nativeParameters && nativeParameters.length > 0)) {
          logWarn('Invalid parameters for card export mode', {
            requestId,
            invalidParams: {
              query: query ? 'provided' : 'not provided',
              database_id: databaseId ? 'provided' : 'not provided',
              native_parameters: nativeParameters?.length > 0 ? 'provided' : 'not provided',
            },
          });
          throw new McpError(
            ErrorCode.InvalidParams,
            'Card export mode only allows card_id, card_parameters, format, and filename parameters'
          );
        }
      }
    
      // Strict parameter validation for SQL export mode
      if (databaseId !== undefined) {
        // For SQL export, only database_id, query, native_parameters, format, and filename are allowed
        if (cardId !== undefined || (cardParameters && cardParameters.length > 0)) {
          logWarn('Invalid parameters for SQL export mode', {
            requestId,
            invalidParams: {
              card_id: cardId ? 'provided' : 'not provided',
              card_parameters: cardParameters?.length > 0 ? 'provided' : 'not provided',
            },
          });
          throw new McpError(
            ErrorCode.InvalidParams,
            'SQL export mode only allows database_id, query, native_parameters, format, and filename parameters'
          );
        }
      }
    
      // If exporting a card
      if (cardId !== undefined) {
    
        // Validate card parameters format if provided
        if (cardParameters.length > 0) {
          try {
            validateCardParameters(cardParameters, requestId, logWarn);
          } catch (error) {
            logWarn(`Card parameter validation failed for card ${cardId}`, { error, requestId });
            throw new McpError(
              ErrorCode.InvalidParams,
              `Invalid card parameters format. If parameter issues persist, consider using export_query with the card's underlying SQL query instead, which provides more reliable parameter handling. Original error: ${error instanceof Error ? error.message : String(error)}`
            );
          }
        }
    
        const cardParams: CardExportParams = {
          cardId,
          cardParameters,
          format: format as 'csv' | 'json' | 'xlsx',
          filename,
        };
    
        return await exportCard(cardParams, requestId, apiClient, logDebug, logInfo, logWarn, logError);
      }
    
      // If exporting a SQL query
      if (!query || typeof query !== 'string') {
        logWarn('Missing or invalid query parameter in export request', { requestId });
        throw new McpError(
          ErrorCode.InvalidParams,
          'SQL query parameter is required and must be a string'
        );
      }
    
      const sqlParams: SqlExportParams = {
        databaseId: databaseId as number,
        query,
        nativeParameters,
        format: format as 'csv' | 'json' | 'xlsx',
        filename,
      };
    
      return await exportSqlQuery(
        sqlParams,
        requestId,
        apiClient,
        logDebug,
        logInfo,
        logWarn,
        logError
      );
    }
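The exactly-one-of check at the top of the handler can be distilled into a small helper. This is a sketch of the same logic under hypothetical names (`selectExportMode` is not in the source); it preserves the handler's explicit `undefined` comparisons so that an ID of 0 still counts as provided:

```typescript
// Sketch of handleExport's mode selection: exactly one of card_id or
// database_id must be present. Compare against undefined rather than using
// truthiness, so 0 remains a valid ID.
type ExportMode = 'card' | 'sql';

function selectExportMode(args: { card_id?: number; database_id?: number }): ExportMode {
  const hasCard = args.card_id !== undefined;
  const hasDb = args.database_id !== undefined;
  if (!hasCard && !hasDb) {
    throw new Error('Either card_id or database_id parameter is required');
  }
  if (hasCard && hasDb) {
    throw new Error('Cannot specify both card_id and database_id - choose one export method');
  }
  return hasCard ? 'card' : 'sql';
}
```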
  • JSON schema definition for the 'export' tool input parameters as registered in ListToolsRequestHandler.
    {
      name: 'export',
      description:
        'Unified command to export large SQL query results or saved cards using Metabase export endpoints (supports up to 1M rows). Returns data in specified format (CSV, JSON, or XLSX) and automatically saves to Downloads/Metabase folder.',
      annotations: {
        readOnlyHint: true,
        destructiveHint: false,
        idempotentHint: false,
        openWorldHint: true,
      },
      inputSchema: {
        type: 'object',
        properties: {
          database_id: {
            type: 'number',
            description: 'Database ID to export query from (SQL mode only)',
          },
          query: {
            type: 'string',
            description: 'SQL query to execute and export (SQL mode only)',
          },
          card_id: {
            type: 'number',
            description: 'ID of saved card to export (card mode only)',
          },
          native_parameters: {
            type: 'array',
            items: { type: 'object' },
            description:
              'Parameters for SQL template variables like {{variable_name}} (SQL mode only)',
          },
          card_parameters: {
            type: 'array',
            items: { type: 'object' },
            description:
              'Parameters for filtering card results before export (card mode only). Each parameter must follow Metabase format: {id: "uuid", slug: "param_name", target: ["dimension", ["template-tag", "param_name"]], type: "param_type", value: "param_value"}',
          },
          format: {
            type: 'string',
            enum: ['csv', 'json', 'xlsx'],
            description:
              'Export format: csv (text), json (structured data), or xlsx (Excel file)',
            default: 'csv',
          },
          filename: {
            type: 'string',
            description:
              'Custom filename (without extension) for the saved file. If not provided, a timestamp-based name will be used.',
          },
        },
        required: [],
      },
    },
  • src/server.ts:543-554 (registration)
    Registration of the 'export' tool handler in the CallToolRequestHandler switch statement.
    case 'export':
      return safeCall(() =>
        handleExport(
          request,
          requestId,
          this.apiClient,
          this.logDebug.bind(this),
          this.logInfo.bind(this),
          this.logWarn.bind(this),
          this.logError.bind(this)
        )
      );
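The `safeCall` wrapper referenced in the registration is not shown on this page; a hypothetical sketch of what such a wrapper typically does is below, assuming it converts unexpected throws into a failed tool response rather than crashing the server:

```typescript
// Hypothetical sketch of a safeCall-style wrapper (the real implementation
// is not shown on this page): await the handler and normalize any thrown
// value into an error response.
interface ToolResponse {
  content: { type: 'text'; text: string }[];
  isError?: boolean;
}

async function safeCall(fn: () => Promise<ToolResponse>): Promise<ToolResponse> {
  try {
    return await fn();
  } catch (error) {
    const message = error instanceof Error ? error.message : String(error);
    return {
      content: [{ type: 'text', text: JSON.stringify({ success: false, error: message }) }],
      isError: true,
    };
  }
}
```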
  • Helper function implementing card export logic: API call, response processing, file saving to Downloads/Metabase.
    export async function exportCard(
      params: CardExportParams,
      requestId: string,
      apiClient: MetabaseApiClient,
      logDebug: (message: string, data?: unknown) => void,
      logInfo: (message: string, data?: unknown) => void,
      logWarn: (message: string, data?: unknown, error?: Error) => void,
      logError: (message: string, error: unknown) => void
    ): Promise<ExportResponse> {
      const { cardId, cardParameters, format, filename } = params;
    
      logDebug(`Exporting card ${cardId} in ${format} format`);
    
      try {
        // First, fetch the card to get its name for filename purposes
        let cardName = `card_${cardId}`;
        try {
          const cardResponse = await apiClient.getCard(cardId);
          if (cardResponse.data.name) {
            cardName = cardResponse.data.name;
          }
        } catch (cardError) {
          logWarn(`Failed to fetch card name for card ${cardId}`, cardError);
        }
    
        // Use the export endpoint which supports larger result sets (up to 1M rows)
        const exportEndpoint = `/api/card/${cardId}/query/${format}`;
    
        // Build the request body with parameters if provided
        const requestBody = cardParameters.length > 0 ? { parameters: cardParameters } : {};
    
        // For export endpoints, we need to handle different response types
        const url = new URL(exportEndpoint, config.METABASE_URL);
        const headers: Record<string, string> = {
          'Content-Type': 'application/json',
        };
    
        // Add appropriate authentication headers
        if (authMethod === AuthMethod.API_KEY && config.METABASE_API_KEY) {
          headers['X-API-KEY'] = config.METABASE_API_KEY;
        } else if (authMethod === AuthMethod.SESSION && apiClient.sessionToken) {
          headers['X-Metabase-Session'] = apiClient.sessionToken;
        }
    
        const response = await fetch(url.toString(), {
          method: 'POST',
          headers,
          body: JSON.stringify(requestBody),
        });
    
        if (!response.ok) {
          const errorData = await response.json().catch(() => ({}));
          // Extract actual error message from response body, fallback to statusText
          const actualError = errorData?.message || errorData?.error || response.statusText;
          const errorMessage = `Export API request failed with status ${response.status}: ${actualError}`;
          logWarn(errorMessage, errorData);
          throw {
            status: response.status,
            message: actualError,
            data: errorData,
          };
        }
    
        // Handle different response types based on format
        let responseData;
        let rowCount: number | undefined = 0;
        let fileSize = 0;
    
        if (format === 'json') {
          responseData = await response.json();
    
          // Check for embedded errors (Metabase returns 200/202 with errors for invalid queries)
          validateMetabaseResponse(
            responseData,
            { operation: 'Card export', resourceId: cardId },
            logError
          );
    
          // JSON export format might have different structures, let's be more flexible
          if (responseData && typeof responseData === 'object') {
            // Try different possible structures for row counting
            rowCount =
              responseData?.data?.rows?.length ??
              responseData?.rows?.length ??
              (Array.isArray(responseData) ? responseData.length : 0);
          }
          logDebug(`JSON export row count: ${rowCount}`);
        } else if (format === 'csv') {
          responseData = await response.text();
    
          // Check if Metabase returned JSON error instead of CSV (starts with '{')
          if (responseData.trim().startsWith('{')) {
            let errorResponse;
            try {
              errorResponse = JSON.parse(responseData);
            } catch {
              // Not valid JSON, continue with CSV processing
              errorResponse = null;
            }
            if (errorResponse) {
              validateMetabaseResponse(
                errorResponse,
                { operation: 'Card export', resourceId: cardId },
                logError
              );
            }
          }
    
          // Count rows for CSV (subtract header row)
          const rows = responseData.split('\n').filter((row: string) => row.trim());
          rowCount = Math.max(0, rows.length - 1);
          logDebug(`CSV export row count: ${rowCount}`);
        } else if (format === 'xlsx') {
          responseData = await response.arrayBuffer();
          fileSize = responseData.byteLength;
    
          // Check if Metabase returned JSON error instead of XLSX
          // Valid XLSX files start with PK (ZIP signature), not '{', so check first bytes
          const textContent = new TextDecoder().decode(responseData);
          if (textContent.trim().startsWith('{')) {
            let errorResponse;
            try {
              errorResponse = JSON.parse(textContent);
            } catch {
              // Not valid JSON, continue with XLSX processing
              errorResponse = null;
            }
            if (errorResponse) {
              validateMetabaseResponse(
                errorResponse,
                { operation: 'Card export', resourceId: cardId },
                logError
              );
            }
          }
    
          // Analyze XLSX content to get accurate row count and data validation
          const xlsxAnalysis = analyzeXlsxContent(responseData);
          rowCount = xlsxAnalysis.rowCount;
    
          logDebug(
            `XLSX export - file size: ${fileSize} bytes, rows: ${rowCount}, has data: ${xlsxAnalysis.hasData}`
          );
        }
    
        // Validate that we have data before proceeding with file operations
        // Check row count for all formats
        const hasData = rowCount !== null && rowCount !== undefined && rowCount > 0;
        if (!hasData) {
          logWarn(`Card ${cardId} returned no data for export`, { requestId });
          return {
            content: [
              {
                type: 'text',
                text: formatJson({
                  success: false,
                  error: 'Card returned no data to export',
                }),
              },
            ],
          };
        }
    
        // Always save files to Downloads/Metabase directory
        const timestamp = new Date().toISOString().slice(0, 19).replace(/:/g, '-');
        const sanitizedCustomFilename = sanitizeFilename(filename);
        const sanitizedCardName = sanitizeFilename(cardName);
        const baseFilename = sanitizedCustomFilename || `${sanitizedCardName}_${timestamp}`;
        const finalFilename = `${baseFilename}.${format}`;
    
        // Use configured export directory
        const exportDirectory = config.EXPORT_DIRECTORY;
        const savedFilePath = path.join(exportDirectory, finalFilename);
    
        let fileSaveError: string | undefined;
    
        try {
          // Ensure export directory exists
          if (!fs.existsSync(exportDirectory)) {
            fs.mkdirSync(exportDirectory, { recursive: true });
          }
    
          // Write the file based on format and calculate file size
          if (format === 'json') {
            const jsonString = JSON.stringify(responseData, null, 2);
            fs.writeFileSync(savedFilePath, jsonString, 'utf8');
            fileSize = Buffer.byteLength(jsonString, 'utf8');
          } else if (format === 'csv') {
            fs.writeFileSync(savedFilePath, responseData, 'utf8');
            fileSize = Buffer.byteLength(responseData, 'utf8');
          } else if (format === 'xlsx') {
            // Handle binary data for XLSX
            if (responseData instanceof ArrayBuffer) {
              const buffer = Buffer.from(responseData);
              fs.writeFileSync(savedFilePath, buffer);
              fileSize = buffer.length;
            } else {
              throw new Error('XLSX response is not in expected ArrayBuffer format');
            }
          }
    
          logInfo(`Successfully exported to ${savedFilePath}`);
        } catch (saveError) {
          fileSaveError = saveError instanceof Error ? saveError.message : 'Unknown file save error';
          logError(`Failed to save export file: ${fileSaveError}`, saveError);
        }
    
        // Generate standardized JSON response
        if (fileSaveError) {
          const errorResponse: any = {
            success: false,
            error: fileSaveError,
          };
    
          return {
            content: [
              {
                type: 'text',
                text: formatJson(errorResponse),
              },
            ],
            isError: true,
          };
        }
    
        // Extract preview data (first 5 rows) for the response
        const previewData = extractPreviewData(responseData, format);
    
        // Successful export - return standardized JSON response
        const successResponse: any = {
          success: true,
          file_path: savedFilePath,
          row_count: rowCount,
          file_size_bytes: fileSize,
          preview_data: previewData,
        };
    
        return {
          content: [
            {
              type: 'text',
              text: formatJson(successResponse),
            },
          ],
        };
      } catch (error: any) {
        throw handleApiError(
          error,
          { operation: 'Export card', resourceType: 'card', resourceId: cardId },
          logError
        );
      }
    }
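The CSV branch above guards against Metabase returning a JSON error body with a 200 status, then counts data rows by subtracting the header. A minimal standalone sketch of both steps (helper names are illustrative; the real module inlines this logic):

```typescript
// A JSON error body starts with '{' and parses cleanly; ordinary CSV does not.
// Returns the parsed error object, or null if the body looks like real CSV.
function parseEmbeddedJsonError(body: string): unknown {
  if (!body.trim().startsWith('{')) return null;
  try {
    return JSON.parse(body);
  } catch {
    return null; // not valid JSON, treat as ordinary CSV
  }
}

// Count data rows: drop blank lines, then subtract the header row.
function countCsvDataRows(csv: string): number {
  const rows = csv.split('\n').filter((row) => row.trim());
  return Math.max(0, rows.length - 1);
}
```

One caveat mirrored from the original: a legitimate CSV whose first cell happens to begin with `{` and parse as JSON would be misclassified, which the source accepts as an edge case.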
  • Helper function implementing SQL query export logic: API call to dataset export endpoint, file saving.
    export async function exportSqlQuery(
      params: SqlExportParams,
      requestId: string,
      apiClient: MetabaseApiClient,
      logDebug: (message: string, data?: unknown) => void,
      logInfo: (message: string, data?: unknown) => void,
      logWarn: (message: string, data?: unknown, error?: Error) => void,
      logError: (message: string, error: unknown) => void
    ): Promise<ExportResponse> {
      const { databaseId, query, nativeParameters, format, filename } = params;
    
      logDebug(`Exporting SQL query in ${format} format from database ID: ${databaseId}`);
    
      try {
        // Build query request body according to Metabase export API requirements
        const queryData = {
          type: 'native',
          native: {
            query: query,
            template_tags: {},
          },
          parameters: nativeParameters,
          database: databaseId,
        };
    
        // Use the export endpoint which supports larger result sets (up to 1M rows)
        const exportEndpoint = `/api/dataset/${format}`;
    
        // Build the request body with required parameters as per API documentation
        const requestBody = {
          query: queryData,
          format_rows: false,
          pivot_results: false,
          visualization_settings: {},
        };
    
        // For export endpoints, we need to handle different response types
        const url = new URL(exportEndpoint, config.METABASE_URL);
        const headers: Record<string, string> = {
          'Content-Type': 'application/json',
        };
    
        // Add appropriate authentication headers
        if (authMethod === AuthMethod.API_KEY && config.METABASE_API_KEY) {
          headers['X-API-KEY'] = config.METABASE_API_KEY;
        } else if (authMethod === AuthMethod.SESSION && apiClient.sessionToken) {
          headers['X-Metabase-Session'] = apiClient.sessionToken;
        }
    
        const response = await fetch(url.toString(), {
          method: 'POST',
          headers,
          body: JSON.stringify(requestBody),
        });
    
        if (!response.ok) {
          const errorData = await response.json().catch(() => ({}));
          // Extract actual error message from response body, fallback to statusText
          const actualError = errorData?.message || errorData?.error || response.statusText;
          const errorMessage = `Export API request failed with status ${response.status}: ${actualError}`;
          logWarn(errorMessage, errorData);
          throw {
            status: response.status,
            message: actualError,
            data: errorData,
          };
        }
    
        // Handle different response types based on format
        let responseData;
        let rowCount: number | undefined = 0;
        let fileSize = 0;
    
        if (format === 'json') {
          responseData = await response.json();
    
          // Check for embedded errors (Metabase returns 200/202 with errors for invalid queries)
          validateMetabaseResponse(
            responseData,
            { operation: 'SQL query export', resourceId: databaseId },
            logError
          );
    
          // JSON export format might have different structures, let's be more flexible
          if (responseData && typeof responseData === 'object') {
            // Try different possible structures for row counting
            rowCount =
              responseData?.data?.rows?.length ??
              responseData?.rows?.length ??
              (Array.isArray(responseData) ? responseData.length : 0);
          }
          logDebug(`JSON export row count: ${rowCount}`);
        } else if (format === 'csv') {
          responseData = await response.text();
    
          // Check if Metabase returned JSON error instead of CSV (starts with '{')
          if (responseData.trim().startsWith('{')) {
            let errorResponse;
            try {
              errorResponse = JSON.parse(responseData);
            } catch {
              // Not valid JSON, continue with CSV processing
              errorResponse = null;
            }
            if (errorResponse) {
              validateMetabaseResponse(
                errorResponse,
                { operation: 'SQL query export', resourceId: databaseId },
                logError
              );
            }
          }
    
          // Count rows for CSV (subtract header row)
          const rows = responseData.split('\n').filter((row: string) => row.trim());
          rowCount = Math.max(0, rows.length - 1);
          logDebug(`CSV export row count: ${rowCount}`);
        } else if (format === 'xlsx') {
          responseData = await response.arrayBuffer();
          fileSize = responseData.byteLength;
    
          // Check if Metabase returned JSON error instead of XLSX
          // Valid XLSX files start with PK (ZIP signature), not '{', so check first bytes
          const textContent = new TextDecoder().decode(responseData);
          if (textContent.trim().startsWith('{')) {
            let errorResponse;
            try {
              errorResponse = JSON.parse(textContent);
            } catch {
              // Not valid JSON, continue with XLSX processing
              errorResponse = null;
            }
            if (errorResponse) {
              validateMetabaseResponse(
                errorResponse,
                { operation: 'SQL query export', resourceId: databaseId },
                logError
              );
            }
          }
    
          // Analyze XLSX content to get accurate row count and data validation
          const xlsxAnalysis = analyzeXlsxContent(responseData);
          rowCount = xlsxAnalysis.rowCount;
    
          logDebug(
            `XLSX export - file size: ${fileSize} bytes, rows: ${rowCount}, has data: ${xlsxAnalysis.hasData}`
          );
        }
    
        // Validate that we have data before proceeding with file operations
        // For all formats including XLSX, check row count
        const hasData = rowCount !== null && rowCount !== undefined && rowCount > 0;
        if (!hasData) {
          logWarn(`Query returned no data for export`, { requestId });
          return {
            content: [
              {
                type: 'text',
                text: formatJson({
                  success: false,
                  error: 'Query returned no data to export',
                }),
              },
            ],
          };
        }
    
        // Always save files to Downloads/Metabase directory
        const timestamp = new Date().toISOString().slice(0, 19).replace(/:/g, '-');
        const sanitizedCustomFilename = sanitizeFilename(filename);
        const baseFilename = sanitizedCustomFilename || `metabase_export_${timestamp}`;
        const finalFilename = `${baseFilename}.${format}`;
    
        // Use configured export directory
        const exportDirectory = config.EXPORT_DIRECTORY;
        const savedFilePath = path.join(exportDirectory, finalFilename);
    
        let fileSaveError: string | undefined;
    
        try {
          // Ensure export directory exists
          if (!fs.existsSync(exportDirectory)) {
            fs.mkdirSync(exportDirectory, { recursive: true });
          }
    
          // Write the file based on format and calculate file size
          if (format === 'json') {
            const jsonString = JSON.stringify(responseData, null, 2);
            fs.writeFileSync(savedFilePath, jsonString, 'utf8');
            fileSize = Buffer.byteLength(jsonString, 'utf8');
          } else if (format === 'csv') {
            fs.writeFileSync(savedFilePath, responseData, 'utf8');
            fileSize = Buffer.byteLength(responseData, 'utf8');
          } else if (format === 'xlsx') {
            // Handle binary data for XLSX
            if (responseData instanceof ArrayBuffer) {
              const buffer = Buffer.from(responseData);
              fs.writeFileSync(savedFilePath, buffer);
              fileSize = buffer.length;
            } else {
              throw new Error('XLSX response is not in expected ArrayBuffer format');
            }
          }
    
          logInfo(`Successfully exported to ${savedFilePath}`);
        } catch (saveError) {
          fileSaveError = saveError instanceof Error ? saveError.message : 'Unknown file save error';
          logError(`Failed to save export file: ${fileSaveError}`, saveError);
        }
    
        // Generate standardized JSON response
        if (fileSaveError) {
          const errorResponse: any = {
            success: false,
            error: fileSaveError,
          };
    
          return {
            content: [
              {
                type: 'text',
                text: formatJson(errorResponse),
              },
            ],
            isError: true,
          };
        }
    
        // Extract preview data (first 5 rows) for the response
        const previewData = extractPreviewData(responseData, format);
    
        // Successful export - return standardized JSON response
        const successResponse: any = {
          success: true,
          file_path: savedFilePath,
          row_count: rowCount,
          file_size_bytes: fileSize,
          preview_data: previewData,
        };
    
        return {
          content: [
            {
              type: 'text',
              text: formatJson(successResponse),
            },
          ],
        };
      } catch (error: any) {
        throw handleApiError(
          error,
          { operation: 'Export query', resourceType: 'database', resourceId: databaseId },
          logError
        );
      }
    }
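Both export paths share the same fallback-filename scheme: an ISO timestamp trimmed to seconds with colons replaced, so the name stays filesystem-safe on Windows. A sketch under the assumption that `sanitizeFilename` returns a cleaned string, or an empty string for missing input (`buildExportFilename` is an illustrative name, not in the source):

```typescript
// Sketch of the fallback-filename logic from exportSqlQuery/exportCard.
// The sanitizer is injected here for testability; the source calls a
// module-level sanitizeFilename instead.
function buildExportFilename(
  custom: string | undefined,
  format: 'csv' | 'json' | 'xlsx',
  sanitize: (s?: string) => string
): string {
  // e.g. '2024-05-01T12:34:56.789Z' -> '2024-05-01T12-34-56'
  const timestamp = new Date().toISOString().slice(0, 19).replace(/:/g, '-');
  const base = sanitize(custom) || `metabase_export_${timestamp}`;
  return `${base}.${format}`;
}
```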
Behavior: 4/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

Annotations already indicate read-only and non-destructive behavior, but the description adds valuable context beyond this: it specifies the scale limit ('up to 1M rows'), automatic saving to 'Downloads/Metabase folder', and support for multiple export formats. This enhances transparency about operational constraints and outcomes without contradicting annotations.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is appropriately sized and front-loaded, with a single sentence that efficiently conveys the core functionality, scale, formats, and saving behavior without unnecessary details. Every element earns its place by adding value, such as the row limit and folder path, making it concise and well-structured.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 4/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Given the tool's complexity (7 parameters, no output schema) and rich annotations, the description is mostly complete: it covers purpose, scale, formats, and saving behavior. However, it lacks details on error handling, authentication needs, or rate limits, which could be useful for a tool handling large exports. With annotations covering safety, this is a minor gap.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

With 100% schema description coverage, the input schema fully documents all 7 parameters, including their purposes and constraints. The description adds minimal parameter semantics, only mentioning 'specified format (CSV, JSON, or XLSX)' and 'automatically saves to Downloads/Metabase folder', which aligns with the schema but doesn't provide significant additional meaning. This meets the baseline for high schema coverage.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the tool's purpose with specific verbs ('export large SQL query results or saved cards') and resources ('Metabase export endpoints'), distinguishing it from siblings like 'execute' or 'retrieve' by focusing on data export rather than execution or retrieval. It specifies the scale ('up to 1M rows'), output formats, and automatic saving behavior, making the purpose unambiguous and differentiated.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 4/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description provides clear context for when to use this tool ('export large SQL query results or saved cards') and implies usage for bulk data export scenarios. However, it does not explicitly state when not to use it or name alternatives among siblings (e.g., 'execute' for running queries without export, 'retrieve' for fetching data without saving), leaving some guidance gaps.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
