
PostgreSQL MCP Server

pg_import_table_data

Transfer data from a JSON or CSV file into a PostgreSQL table. Specify the table name and file path, along with optional truncation, format, and delimiter settings.

Instructions

Import data from JSON or CSV file into a table

Input Schema

| Name             | Required | Description                         | Default |
|------------------|----------|-------------------------------------|---------|
| connectionString | No       |                                     |         |
| delimiter        | No       |                                     |         |
| format           | No       |                                     | json    |
| inputPath        | Yes      | absolute path to the file to import |         |
| tableName        | Yes      |                                     |         |
| truncateFirst    | No       |                                     | false   |
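
For illustration, a call to this tool might pass arguments like the following (the table name, file path, and delimiter are hypothetical; connectionString is omitted since it is optional per the schema):

    // Hypothetical arguments for pg_import_table_data importing a CSV file.
    const params = {
      tableName: 'customers',
      inputPath: '/data/customers.csv',
      format: 'csv',
      delimiter: ';',
      truncateFirst: true,
    };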

Implementation Reference

  • The core handler function that executes the logic for importing table data from a JSON or CSV file into a PostgreSQL table. It handles file reading, parsing, optional truncation, and row-by-row inserts wrapped in a single transaction.
    async function executeImportTableData(
      input: ImportTableDataInput,
      getConnectionString: GetConnectionStringFn
    ): Promise<{ tableName: string; rowCount: number }> {
      const resolvedConnectionString = getConnectionString(input.connectionString);
      const db = DatabaseConnection.getInstance();
      const { tableName, inputPath, truncateFirst, format, delimiter } = input;
      try {
        await db.connect(resolvedConnectionString);
        const fileContent = await fs.promises.readFile(inputPath, 'utf8');
        let dataToImport: Record<string, unknown>[];
        if (format === 'csv') {
          const csvDelimiter = delimiter || ',';
          const lines = fileContent.split('\n').filter(line => line.trim()); // Use \n consistently
          if (lines.length === 0) {
            return { tableName, rowCount: 0 };
          }
          // Remove surrounding quotes from headers
          const headers = lines[0].split(csvDelimiter).map(h => h.trim().replace(/^"|"$/g, ''));
          dataToImport = lines.slice(1).map(line => {
            // Basic CSV parsing, might need a more robust library for complex CSVs
            const values = line.split(csvDelimiter).map(val => val.trim().replace(/^"|"$/g, '').replace(/""/g, '"'));
            return headers.reduce((obj, header, index) => {
              obj[header] = values[index] !== undefined ? values[index] : null;
              return obj;
            }, {} as Record<string, unknown>);
          });
        } else {
          dataToImport = JSON.parse(fileContent);
        }
        if (!Array.isArray(dataToImport)) {
          throw new Error('Input file does not contain an array of records');
        }
        if (truncateFirst) {
          await db.query(`TRUNCATE TABLE "${tableName}"`); // Consider quoting
        }
        let importedCount = 0;
        if (dataToImport.length > 0) {
          await db.transaction(async (client: import('pg').PoolClient) => {
            for (const record of dataToImport) {
              const columns = Object.keys(record);
              if (columns.length === 0) continue; // Skip empty records
              const values = Object.values(record);
              const placeholders = values.map((_, i) => `$${i + 1}`).join(', ');
              const query = `
                INSERT INTO "${tableName}" (${columns.map(c => `"${c}"`).join(', ')})
                VALUES (${placeholders})
              `;
              await client.query(query, values);
              importedCount++;
            }
          });
        }
        return { tableName, rowCount: importedCount };
      } catch (error) {
        throw new McpError(
          ErrorCode.InternalError,
          `Failed to import data: ${error instanceof Error ? error.message : String(error)}`
        );
      } finally {
        await db.disconnect();
      }
    }
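  • A minimal usage sketch (not part of the source) showing how the core handler could be invoked directly. The connection-string resolver, file path, and table name are assumptions for illustration, inferred from how the function is used above.
    // Hypothetical input file /tmp/users.json containing an array of records, e.g.
    // [{ "id": 1, "name": "Ada" }, { "id": 2, "name": "Grace" }]
    const getConnectionString: GetConnectionStringFn = (override) =>
      override ?? process.env.DATABASE_URL ?? 'postgresql://localhost:5432/appdb';

    const result = await executeImportTableData(
      { tableName: 'users', inputPath: '/tmp/users.json', truncateFirst: false, format: 'json' },
      getConnectionString
    );
    console.log(`Imported ${result.rowCount} rows into ${result.tableName}`);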
  • Zod input schema defining the parameters for the pg_import_table_data tool: connection details, table name, input file path, truncation option, format, and delimiter.
    const ImportTableDataInputSchema = z.object({
      connectionString: z.string().optional(),
      tableName: z.string(),
      inputPath: z.string().describe("absolute path to the file to import"),
      truncateFirst: z.boolean().optional().default(false),
      format: z.enum(['json', 'csv']).optional().default('json'),
      delimiter: z.string().optional(),
    });
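  • A quick check (not part of the source) of the defaults the schema applies; the table name and path are placeholders.
    // safeParse fills in the schema defaults: format -> 'json', truncateFirst -> false
    const parsed = ImportTableDataInputSchema.safeParse({
      tableName: 'orders',
      inputPath: '/tmp/orders.json',
    });
    if (parsed.success) {
      console.log(parsed.data.format);        // 'json'
      console.log(parsed.data.truncateFirst); // false
    }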
  • The PostgresTool object registration for pg_import_table_data, including name, description, schema reference, and wrapper execute method that handles input validation and delegates to the core handler.
    export const importTableDataTool: PostgresTool = {
      name: 'pg_import_table_data',
      description: 'Import data from JSON or CSV file into a table',
      inputSchema: ImportTableDataInputSchema,
      async execute(params: unknown, getConnectionString: GetConnectionStringFn): Promise<ToolOutput> {
        const validationResult = ImportTableDataInputSchema.safeParse(params);
        if (!validationResult.success) {
          return { content: [{ type: 'text', text: `Invalid input: ${validationResult.error.format()}` }], isError: true };
        }
        try {
          const result = await executeImportTableData(validationResult.data, getConnectionString);
          return { content: [{ type: 'text', text: `Successfully imported ${result.rowCount} rows into ${result.tableName}` }] };
        } catch (error) {
          const errorMessage = error instanceof McpError
            ? error.message
            : (error instanceof Error ? error.message : String(error));
          return { content: [{ type: 'text', text: `Error importing data: ${errorMessage}` }], isError: true };
        }
      }
    };
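  • An end-to-end sketch (not part of the source) exercising the registered tool through its execute wrapper; the connection string, table name, and file path are illustrative.
    const output = await importTableDataTool.execute(
      { tableName: 'events', inputPath: '/tmp/events.csv', format: 'csv' },
      (override) => override ?? 'postgresql://user:pass@localhost:5432/appdb'
    );
    // On success the text reads "Successfully imported <n> rows into events";
    // on failure isError is true and the text carries the error message.
    console.log(output.content[0].text);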
