bulk_import
Import large datasets from cloud or web storage (Amazon S3, Azure Blob Storage, Google Cloud Storage, HTTP/HTTPS) into a CockroachDB table using CSV or Avro formats. Delimiter and header options allow tailored data ingestion.
Instructions
Bulk import data into a table from a CSV or Avro file stored in cloud or web storage. Supports S3, Azure Blob Storage, Google Cloud Storage, and HTTP/HTTPS URLs.
Args:
- table_name (str): Name of the table to import data into.
- file_url (str): URL to the data file (s3://, azure://, gs://, http://, https://, etc.).
- format (str): File format ('csv' or 'avro').
- delimiter (str): CSV delimiter (default: ',').
- skip_header (bool): Whether to skip the first row as a header (default: True).
Returns: A success message or an error message.
Example: bulk_import(ctx, table_name="users", file_url="s3://bucket/data.csv", format="csv", delimiter=";", skip_header=True)
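The documentation does not show the SQL the tool generates, but CockroachDB bulk loads go through the IMPORT INTO statement, so the arguments above presumably map onto it roughly as sketched below. The helper `build_import_statement` is hypothetical, not part of this tool's API, and the generated statement is an assumption about the implementation.

```python
def build_import_statement(table_name: str, file_url: str, fmt: str,
                           delimiter: str = ",", skip_header: bool = True) -> str:
    """Sketch: translate bulk_import-style arguments into a CockroachDB IMPORT INTO statement."""
    if fmt.lower() == "csv":
        options = [f"delimiter = '{delimiter}'"]
        if skip_header:
            # CockroachDB's `skip` option ignores the first N rows of the file.
            options.append("skip = '1'")
        return (f"IMPORT INTO {table_name} CSV DATA ('{file_url}') "
                f"WITH {', '.join(options)}")
    elif fmt.lower() == "avro":
        # Avro files are self-describing, so no delimiter/header options are needed.
        return f"IMPORT INTO {table_name} AVRO DATA ('{file_url}')"
    raise ValueError("format must be 'csv' or 'avro'")

# Mirrors the example above; prints:
# IMPORT INTO users CSV DATA ('s3://bucket/data.csv') WITH delimiter = ';', skip = '1'
print(build_import_statement("users", "s3://bucket/data.csv", "csv", ";", True))
```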
Input Schema
Name | Required | Description | Default |
---|---|---|---|
delimiter | No | CSV delimiter. | , |
file_url | Yes | URL to the data file (s3://, azure://, gs://, http://, https://, etc.). | |
format | Yes | File format ('csv' or 'avro'). | |
skip_header | No | Whether to skip the first row as a header. | True |
table_name | Yes | Name of the table to import data into. | |
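The delimiter and skip_header options presumably apply only to CSV input, so an Avro import can rely on the defaults above. A hypothetical call (table and bucket names are made up for illustration):

```python
# Hypothetical Avro import: delimiter and skip_header are CSV-only,
# so only the required arguments are passed.
bulk_import(ctx, table_name="events", file_url="gs://bucket/events.avro", format="avro")
```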