upload_file_to_dbfs
Transfers a local file to the Databricks File System (DBFS), which is well suited to temporary storage, scripts, and smaller datasets. Large files are uploaded in chunks, and the tool returns JSON output with the success status, file size, and upload time.
Instructions
Upload a local file to Databricks File System (DBFS).
Args:
local_file_path: Path to local file (e.g. './data/notebook.py')
dbfs_path: DBFS path (e.g. '/tmp/uploaded/notebook.py')
overwrite: Whether to overwrite existing file (default: True)
Returns:
JSON with upload results including success status, file size, and upload time.
Example:

```python
# Upload a script to DBFS
result = upload_file_to_dbfs(
    local_file_path='./scripts/analysis.py',
    dbfs_path='/tmp/analysis.py',
    overwrite=True,
)
```
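The description above says the result is JSON containing a success status, file size, and upload time, but the exact keys are not documented here. A minimal sketch of consuming that result, assuming hypothetical field names such as `success`, `file_size_bytes`, `upload_time_seconds`, and `error`:

```python
import json

# Hypothetical result handling: the key names below are assumptions based on the
# tool description, not a documented schema -- verify against the actual output.
result = upload_file_to_dbfs(
    local_file_path='./scripts/analysis.py',
    dbfs_path='/tmp/analysis.py',
    overwrite=True,
)
payload = json.loads(result)  # assumes the tool returns a JSON string

if payload.get('success'):
    print(f"Uploaded {payload.get('file_size_bytes')} bytes "
          f"in {payload.get('upload_time_seconds')} s")
else:
    print(f"Upload failed: {payload.get('error')}")
```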
Note: For large files (>10 MB), the tool uses a chunked upload with retry logic.
DBFS is well suited to temporary files, scripts, and smaller datasets.
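The note above is not specific about how the chunking works inside this tool, but a chunked DBFS upload is typically built on the DBFS REST API's streaming endpoints (`/api/2.0/dbfs/create`, `/add-block`, `/close`). A minimal sketch of that pattern (without the retry logic), assuming `DATABRICKS_HOST` and `DATABRICKS_TOKEN` environment variables and the `requests` library:

```python
import base64
import os
import requests

HOST = os.environ['DATABRICKS_HOST']   # e.g. https://<workspace>.cloud.databricks.com
HEADERS = {'Authorization': f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
CHUNK = 1024 * 1024  # DBFS add-block accepts at most 1 MB of data per call

def chunked_dbfs_upload(local_file_path: str, dbfs_path: str, overwrite: bool = True) -> None:
    # Open a streaming handle on the target DBFS path.
    r = requests.post(f'{HOST}/api/2.0/dbfs/create', headers=HEADERS,
                      json={'path': dbfs_path, 'overwrite': overwrite})
    r.raise_for_status()
    handle = r.json()['handle']
    try:
        with open(local_file_path, 'rb') as f:
            while chunk := f.read(CHUNK):
                # Each block is base64-encoded and appended to the open handle.
                requests.post(f'{HOST}/api/2.0/dbfs/add-block', headers=HEADERS,
                              json={'handle': handle,
                                    'data': base64.b64encode(chunk).decode()}
                              ).raise_for_status()
    finally:
        # Close the handle to finalize the file on DBFS.
        requests.post(f'{HOST}/api/2.0/dbfs/close', headers=HEADERS,
                      json={'handle': handle}).raise_for_status()
```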
Input Schema
Name | Required | Description | Default
---|---|---|---
local_file_path | Yes | Path to the local file (e.g. './data/notebook.py') |
dbfs_path | Yes | Destination DBFS path (e.g. '/tmp/uploaded/notebook.py') |
overwrite | No | Whether to overwrite an existing file at dbfs_path | True