I am trying to export a CSV file by following this guide, https://docs.databricks.com/dev-tools/cli/index.html , but there is no response when I execute the command below; it just exits immediately without saying whether the export succeeded or failed.
I have finished installing the CLI and set up authentication by entering a host and token in the Mac terminal, following the same guide.
export DATABRICKS_CONFIG_FILE="dbfs:/FileStore/tables/partition.csv"
Please refer to this screenshot. First, I wrote the dataframe into the file system with the code below:
df.coalesce(1).write.mode("overwrite").csv("dbfs:/FileStore/tables/partition.csv")
How can I successfully export the file from Databricks, and where is it stored locally?
Yes, you can copy the file to your local machine or move it to another destination as needed.
Configure the Databricks CLI for Azure Databricks:
Please follow these steps:
1. Install the CLI: pip install databricks-cli
2. Run the databricks configure --token command.
3. Enter your Azure Databricks host name: https://adb-xxxxx.azuredatabricks.net/
4. Paste your Personal Access Token.
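For reference, databricks configure --token stores the host and token in an INI-style profile file (~/.databrickscfg by default, or whatever path the DATABRICKS_CONFIG_FILE environment variable points to, which is why setting that variable produces no output on its own). Below is a minimal sketch of what that file looks like, written and read back with Python's configparser; the host and token values are placeholders, not real credentials:

```python
import configparser
import os
import tempfile

# Placeholder values -- substitute your real workspace URL and token.
HOST = "https://adb-xxxxx.azuredatabricks.net/"
TOKEN = "dapiXXXXXXXXXXXX"

# Write a profile file shaped like the one `databricks configure --token` creates.
cfg = configparser.ConfigParser()
cfg["DEFAULT"] = {"host": HOST, "token": TOKEN}
path = os.path.join(tempfile.mkdtemp(), ".databrickscfg")
with open(path, "w") as f:
    cfg.write(f)

# Read it back the way the CLI resolves the DEFAULT profile.
check = configparser.ConfigParser()
check.read(path)
print(check["DEFAULT"]["host"])
```

If the command appears to do nothing, opening this file is a quick way to confirm which host and profile the CLI will actually use.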
Now everything is set to export the CSV file and store it in a destination location.

To copy the file from DBFS to your local machine (it is stored at whatever local path you give as the destination):

databricks fs cp dbfs:/FileStore/tables/partition.csv ./partition.csv

To copy it to another DBFS folder:

databricks fs cp dbfs:/FileStore/tables/partition.csv dbfs:/destination/your_folder/file.csv

To upload a local file to DBFS:

databricks fs cp C:/folder/file.csv dbfs:/FileStore/folder
Or, if you have many CSV files in a folder and prefer to export the entire folder rather than individual files, use the -r (recursive) flag to copy the folder instead of a single file. Note that df.coalesce(1).write.csv(...) actually creates a directory of that name containing part-*.csv files, so a recursive copy is what you need for that output:
databricks fs cp -r dbfs:/<folder> destination/folder
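Because Spark writes the CSV output as a folder of part-*.csv files, after a recursive download you will usually want to stitch the parts into a single local file. Here is a minimal sketch; the folder and file names are made up for illustration, and it assumes the parts were written without headers (if you wrote headers, keep only the first file's header line):

```python
import glob
import os
import tempfile

def merge_part_files(folder, out_path):
    """Concatenate Spark part-*.csv files from `folder` into one CSV file."""
    parts = sorted(glob.glob(os.path.join(folder, "part-*.csv")))
    with open(out_path, "w") as out:
        for p in parts:
            with open(p) as f:
                out.write(f.read())
    return out_path

# Demo with fake part files standing in for a downloaded DBFS folder.
folder = tempfile.mkdtemp()
for i, rows in enumerate(["a,1\n", "b,2\n"]):
    with open(os.path.join(folder, f"part-{i:05d}.csv"), "w") as f:
        f.write(rows)

merged = merge_part_files(folder, os.path.join(folder, "merged.csv"))
print(open(merged).read())
```

Sorting the part names keeps the rows in the order Spark numbered them.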
Alternative approach in Python:
From a notebook, you can use dbutils.fs.cp directly:
dbutils.fs.cp("dbfs:/FileStore/gender_submission.csv", "destination/folder")
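One caveat with dbutils.fs.cp: it runs inside a Databricks notebook and copies between filesystems the workspace can see (DBFS, mounts, or the driver's local disk via a file:/ prefix), not to your laptop. Below is a hedged sketch of a copy helper that uses dbutils when it exists and falls back to shutil for plain local paths; the fallback is my own assumption for illustration, not part of the Databricks API:

```python
import shutil

def copy_file(src, dst):
    """Copy src to dst with dbutils.fs.cp when running in a Databricks
    notebook, otherwise with shutil.copy for ordinary local paths."""
    try:
        # `dbutils` is injected into notebook globals by Databricks;
        # it is not importable in a plain Python process.
        dbutils.fs.cp(src, dst)  # noqa: F821
    except NameError:
        shutil.copy(src, dst)
    return dst

# Local demo exercising the fallback path.
import os
import tempfile

src = os.path.join(tempfile.mkdtemp(), "in.csv")
with open(src, "w") as f:
    f.write("x,1\n")
dst = src.replace("in.csv", "out.csv")
copy_file(src, dst)
```

Inside a notebook the same call would go through dbutils.fs.cp instead of the shutil branch.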