upload_file_to_volume
Transfer local files to Databricks Unity Catalog volumes for efficient processing. Supports large files with progress tracking, error handling, and optional overwrite.
Instructions
Upload a local file to a Databricks Unity Catalog volume.
Args:
local_file_path: Path to local file (e.g. './data/products.json')
volume_path: Full volume path (e.g. '/Volumes/catalog/schema/volume/file.json')
overwrite: Whether to overwrite existing file (default: False)
Returns:
JSON with upload results including success status, file size in MB, and upload time.
Example:
```python
# Upload large dataset to volume
result = upload_file_to_volume(
    local_file_path='./stark_export/products_full.json',
    volume_path='/Volumes/kbqa/stark_mas_eval/stark_raw_data/products_full.json',
    overwrite=True
)
```
Note: Handles large files (multi-GB) with progress tracking and proper error handling.
Useful for uploading extracted datasets to Unity Catalog volumes for downstream processing.
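The tool's actual implementation is not shown on this page. As a rough, hypothetical sketch, an equivalent upload can be written with the Databricks SDK for Python Files API (`WorkspaceClient().files.upload`); the function name mirrors the tool, but the timing and error handling below are illustrative assumptions, not the tool's code.

```python
# Illustrative sketch only -- NOT the tool's actual implementation.
# Assumes the databricks-sdk package is installed and credentials are
# configured (e.g. via DATABRICKS_HOST / DATABRICKS_TOKEN).
import json
import os
import time

from databricks.sdk import WorkspaceClient


def upload_file_to_volume(local_file_path: str, volume_path: str, overwrite: bool = False) -> str:
    """Upload a local file to a Unity Catalog volume and return a JSON summary."""
    w = WorkspaceClient()
    size_mb = os.path.getsize(local_file_path) / (1024 * 1024)

    start = time.time()
    try:
        with open(local_file_path, "rb") as f:
            # The Files API streams from the file handle, so multi-GB files
            # do not need to be loaded into memory.
            w.files.upload(volume_path, f, overwrite=overwrite)
    except Exception as exc:  # surface failures as a structured result
        return json.dumps({"success": False, "error": str(exc)})

    return json.dumps({
        "success": True,
        "file_size_mb": round(size_mb, 2),
        "upload_time_seconds": round(time.time() - start, 2),
    })
```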
Input Schema
| Name | Required | Description | Default |
| --- | --- | --- | --- |
| local_file_path | Yes | Path to the local file (e.g. './data/products.json') | |
| overwrite | No | Whether to overwrite an existing file | false |
| volume_path | Yes | Full volume path (e.g. '/Volumes/catalog/schema/volume/file.json') | |
Input Schema (JSON Schema)
```json
{
  "properties": {
    "local_file_path": {
      "title": "Local File Path",
      "type": "string"
    },
    "overwrite": {
      "default": false,
      "title": "Overwrite",
      "type": "boolean"
    },
    "volume_path": {
      "title": "Volume Path",
      "type": "string"
    }
  },
  "required": [
    "local_file_path",
    "volume_path"
  ],
  "title": "upload_file_to_volumeArguments",
  "type": "object"
}
```
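As a quick check against the schema above, the following sketch validates a sample argument payload with the `jsonschema` package (an assumption; MCP clients normally perform this validation before invoking the tool).

```python
# Minimal sketch: validate example arguments against the tool's input schema.
# Assumes the `jsonschema` package is installed.
from jsonschema import validate

schema = {
    "type": "object",
    "title": "upload_file_to_volumeArguments",
    "properties": {
        "local_file_path": {"type": "string", "title": "Local File Path"},
        "volume_path": {"type": "string", "title": "Volume Path"},
        "overwrite": {"type": "boolean", "title": "Overwrite", "default": False},
    },
    "required": ["local_file_path", "volume_path"],
}

args = {
    "local_file_path": "./stark_export/products_full.json",
    "volume_path": "/Volumes/kbqa/stark_mas_eval/stark_raw_data/products_full.json",
    "overwrite": True,
}

validate(instance=args, schema=schema)  # raises ValidationError if args are invalid
```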