HenkDz

PostgreSQL MCP Server

pg_export_table_data

Export PostgreSQL table data to JSON or CSV files for analysis, backup, or data sharing purposes.

Instructions

Export table data to JSON or CSV format

Input Schema

Name             | Required | Description                             | Default
-----------------|----------|-----------------------------------------|--------
connectionString | No       |                                         |
tableName        | Yes      |                                         |
outputPath       | Yes      | absolute path to save the exported data |
where            | No       |                                         |
limit            | No       |                                         |
format           | No       |                                         | json
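For illustration, a call to this tool might pass arguments like the following. The table name, path, and filter values here are invented for the example, not taken from the server's documentation:

```typescript
// Hypothetical arguments for a pg_export_table_data call.
const exportArgs = {
  tableName: "orders",
  outputPath: "/tmp/orders_export.csv", // must be an absolute path
  where: "created_at >= '2024-01-01'",  // raw SQL fragment appended after WHERE
  limit: 1000,                          // positive integer cap on exported rows
  format: "csv",                        // "json" (the default) or "csv"
};

console.log(JSON.stringify(exportArgs, null, 2));
```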

Implementation Reference

  • Core handler function that executes the logic for exporting PostgreSQL table data to JSON or CSV format, including database query, data formatting, and file writing.
    async function executeExportTableData(
      input: ExportTableDataInput,
      getConnectionString: GetConnectionStringFn
    ): Promise<{ tableName: string; rowCount: number; outputPath: string }> {
      const resolvedConnectionString = getConnectionString(input.connectionString);
      const db = DatabaseConnection.getInstance();
      const { tableName, outputPath, where, limit, format } = input;
      
      try {
        await db.connect(resolvedConnectionString);
        
        let query = `SELECT * FROM "${tableName}"`; // table name is double-quoted but not otherwise validated
        const params: unknown[] = [];
        
        if (where) {
          query += ` WHERE ${where}`; // SECURITY: Ensure 'where' is safe or validated if user-supplied
        }
        
        if (limit) {
          query += ` LIMIT ${limit}`;
        }
        
        const data = await db.query<Record<string, unknown>[]>(query, params);
        
        const dir = path.dirname(outputPath);
        await fs.promises.mkdir(dir, { recursive: true }); // ensure the output directory exists
        
        if (format === 'csv') {
          if (data.length === 0) {
            await fs.promises.writeFile(outputPath, '');
          } else {
            const headers = Object.keys(data[0]).join(',');
            const rows = data.map(row => 
              Object.values(row).map(value => {
                const stringValue = String(value); // note: null/undefined stringify to "null"/"undefined"
                return typeof value === 'string' ? `"${stringValue.replace(/"/g, '""')}"` : stringValue;
              }).join(',')
            );
            await fs.promises.writeFile(outputPath, [headers, ...rows].join('\n'));
          }
        } else {
          await fs.promises.writeFile(outputPath, JSON.stringify(data, null, 2));
        }
        
        return {
            tableName,
            rowCount: data.length,
            outputPath
        };
      } catch (error) {
        throw new McpError(ErrorCode.InternalError, `Failed to export data: ${error instanceof Error ? error.message : String(error)}`);
      } finally {
        await db.disconnect();
      }
    }
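The handler above interpolates both `where` and `limit` directly into the SQL string. A minimal sketch of safer query assembly, assuming a node-postgres-style `$n` placeholder convention, could validate the table identifier and bind the limit as a parameter; the raw `where` fragment still has to be treated as trusted input:

```typescript
// Hypothetical helper, not part of the published server code: validates the
// table identifier and binds LIMIT as a query parameter instead of
// interpolating it. The `where` fragment remains raw SQL by design.
function buildExportQuery(tableName: string, where?: string, limit?: number) {
  // Allow only simple unquoted identifiers before double-quoting.
  if (!/^[A-Za-z_][A-Za-z0-9_]*$/.test(tableName)) {
    throw new Error(`Invalid table name: ${tableName}`);
  }
  let query = `SELECT * FROM "${tableName}"`;
  const params: unknown[] = [];
  if (where) {
    query += ` WHERE ${where}`; // still raw SQL -- validate upstream
  }
  if (limit !== undefined) {
    params.push(limit);
    query += ` LIMIT $${params.length}`; // bound as a parameter, not interpolated
  }
  return { query, params };
}

const { query, params } = buildExportQuery("orders", undefined, 50);
console.log(query, params);
```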
  • Zod input schema defining parameters for the pg_export_table_data tool, including table name, output path, optional filters, and format.
    const ExportTableDataInputSchema = z.object({
      connectionString: z.string().optional(),
      tableName: z.string(),
      outputPath: z.string().describe("absolute path to save the exported data"),
      where: z.string().optional(),
      limit: z.number().int().positive().optional(),
      format: z.enum(['json', 'csv']).optional().default('json'),
    });
    type ExportTableDataInput = z.infer<typeof ExportTableDataInputSchema>;
  • Tool object definition and export, including name, description, input schema reference, and wrapper execute function that handles validation and delegates to core handler.
    export const exportTableDataTool: PostgresTool = {
      name: 'pg_export_table_data',
      description: 'Export table data to JSON or CSV format',
      inputSchema: ExportTableDataInputSchema,
      async execute(params: unknown, getConnectionString: GetConnectionStringFn): Promise<ToolOutput> {
        const validationResult = ExportTableDataInputSchema.safeParse(params);
        if (!validationResult.success) {
          // zod's error.format() returns an object; stringify it so the message is readable
          return { content: [{ type: 'text', text: `Invalid input: ${JSON.stringify(validationResult.error.format())}` }], isError: true };
        }
        try {
          const result = await executeExportTableData(validationResult.data, getConnectionString);
          return { content: [{ type: 'text', text: `Successfully exported ${result.rowCount} rows from ${result.tableName} to ${result.outputPath}` }] };
        } catch (error) {
          const errorMessage = error instanceof McpError ? error.message : (error instanceof Error ? error.message : String(error));
          return { content: [{ type: 'text', text: `Error exporting data: ${errorMessage}` }], isError: true };
        }
      }
    };
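The CSV branch in the handler only quotes string values, so a number is emitted bare and a `null` becomes the literal text `null`. A hypothetical stricter escaper (again, not part of the published server code) might emit empty fields for null/undefined and quote anything containing a comma, quote, or newline:

```typescript
// Hypothetical CSV field escaper for comparison with the handler's inline
// logic: null/undefined become empty fields; objects are JSON-encoded;
// fields containing commas, quotes, or newlines are quoted and escaped.
function toCsvField(value: unknown): string {
  if (value === null || value === undefined) return "";
  const s = typeof value === "object" ? JSON.stringify(value) : String(value);
  return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
}

console.log(toCsvField('a,b'));
```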
  • src/index.ts:225-257 (registration)
    Inclusion of the pg_export_table_data tool (via exportTableDataTool) in the central allTools array, which is passed to the MCP server constructor for registration and capability advertisement.
    const allTools: PostgresTool[] = [
      // Core Analysis & Debugging
      analyzeDatabaseTool,
      debugDatabaseTool,
      
      // Schema & Structure Management (Meta-Tools)
      manageSchemaTools,
      manageFunctionsTool,
      manageTriggersTools,
      manageIndexesTool,
      manageConstraintsTool,
      manageRLSTool,
      
      // User & Security Management
      manageUsersTool,
      
      // Query & Performance Management
      manageQueryTool,
      
      // Data Operations (Enhancement Tools)
      executeQueryTool,
      executeMutationTool,
      executeSqlTool,
      
      // Documentation & Metadata
      manageCommentsTool,
      
      // Data Migration & Monitoring
      exportTableDataTool,
      importTableDataTool,
      copyBetweenDatabasesTool,
      monitorDatabaseTool
    ];
Behavior: 2/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

With no annotations provided, the description carries the full burden but provides minimal behavioral context. It mentions output formats but doesn't disclose critical behaviors: whether the tool requires write permissions to the output path, whether it overwrites existing files, performance implications for large tables, authentication needs via connectionString, or error handling. The description doesn't contradict annotations (none exist), but it fails to address important operational aspects.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

Extremely concise single sentence that front-loads the core purpose. Every word earns its place with no redundancy or unnecessary elaboration. The structure is optimal for a basic description.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 2/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

For a 6-parameter tool with no annotations and no output schema, the description is inadequate. It doesn't explain what the tool returns (success/failure indicators, file metadata), doesn't cover important behavioral aspects (permissions, file overwriting, error conditions), and leaves most parameters unexplained. The conciseness comes at the cost of completeness.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 2/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema description coverage is only 17% (1 of 6 parameters has a description). The description adds minimal value beyond the schema: it mentions JSON/CSV formats (already covered by the enum) but doesn't explain parameter semantics such as what 'where' clause syntax to use, how 'limit' interacts with filtering, or the purpose of 'connectionString' beyond being a string. It doesn't compensate for the low schema coverage.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 4/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the action ('Export') and resource ('table data') with specific output formats ('JSON or CSV format'). It distinguishes from siblings like pg_execute_query (which returns results directly) by focusing on file export, but doesn't explicitly differentiate from pg_copy_between_databases (which might also involve data movement).

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 2/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

No guidance on when to use this tool versus alternatives. It doesn't mention when to choose JSON over CSV, when filtering/limiting is appropriate, or how it differs from siblings such as pg_execute_query (which returns data without a file export) or pg_copy_between_databases (which copies data between databases).

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
