upload_to_databricks

upload_to_databricks(remote_path: str, local_path: str | Path) → None

Upload a local file or directory to a Databricks volume.

This function handles uploading either a single file or an entire directory structure to a Databricks volume. It preserves directory hierarchies when uploading folders.
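
The upload logic itself is not reproduced in this reference; the following is a minimal sketch of how the file-versus-directory behavior could be implemented, assuming the databricks-sdk package and a hypothetical VOLUME_ROOT prefix (the actual volume prefix is determined by the package configuration). The helper name _upload_sketch is illustrative, not part of the public API.

from pathlib import Path

from databricks.sdk import WorkspaceClient  # assumes the databricks-sdk package is installed

# Hypothetical volume prefix, for illustration only; the real prefix comes
# from the package's own configuration.
VOLUME_ROOT = "/Volumes/my_catalog/my_schema/my_volume"

def _upload_sketch(remote_path: str, local_path: str | Path) -> None:
    w = WorkspaceClient()  # reads DATABRICKS_HOST / DATABRICKS_TOKEN from the environment
    local_path = Path(local_path)
    if local_path.is_file():
        # Single file: remote_path already includes the target filename.
        targets = [(local_path, f"{VOLUME_ROOT}/{remote_path}")]
    else:
        # Directory: mirror the local hierarchy under remote_path.
        targets = [
            (p, f"{VOLUME_ROOT}/{remote_path}/{p.relative_to(local_path).as_posix()}")
            for p in sorted(local_path.rglob("*"))
            if p.is_file()
        ]
    for src, dst in targets:
        with src.open("rb") as handle:
            w.files.upload(dst, handle, overwrite=True)  # existing files are overwritten
        print(f"Uploaded: {src} to {dst}")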

Parameters:
remote_path : str

Path within the volume where file(s) will be uploaded.

local_path : str or Path

Local file or directory path to upload.

Returns:
None

This function does not return any value but prints progress information to standard output.

Note

  • Progress is printed to standard output during upload.

  • Existing files at the destination will be overwritten without confirmation.

  • For single file uploads, the remote_path should include the target filename.

  • For directory uploads, the remote_path should be the target directory.

Warning

  • The function will fail if the DATABRICKS_TOKEN and DATABRICKS_HOST environment variables are not set; a pre-flight check is sketched below.
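
Because the client is configured from the environment, a quick check such as the sketch below (standard library only) can surface missing configuration before any upload is attempted:

>>> import os
>>> missing = [v for v in ("DATABRICKS_HOST", "DATABRICKS_TOKEN") if not os.environ.get(v)]
>>> if missing:
...     raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")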

Examples

>>> # Upload a single file
>>> upload_to_databricks(
...     "data/reports/report.csv",
...     "./local_data/report.csv",
... )
Uploaded: ./local_data/report.csv to /Volumes/my_catalog/my_schema/my_volume/data/reports/report.csv
>>> # Upload a directory
>>> upload_to_databricks(
...     "data/reports",
...     "./local_data",
... )
Uploaded: ./local_data/file1.csv to /Volumes/my_catalog/my_schema/my_volume/data/reports/file1.csv
Uploaded: ./local_data/subdir/file2.csv to /Volumes/my_catalog/my_schema/my_volume/data/reports/subdir/file2.csv