
How do I copy a file from DBFS to REPOS on Databricks?

I'm currently working on moving a Python .whl file that I have generated in DBFS to my repo located at /Workspace/Repos/My_Repo/My_DBFS_File, so that I can commit the file to Azure DevOps.

Because Databricks Repos is a read-only location, it does not permit me to programmatically copy the file into the repo. The UI provides various options to create or import files from different locations, but not from DBFS.

Is there a workaround to actually move dbfs files to repos and then commit them to Azure DevOps?

The documentation says:

Databricks Runtime 11.2 or above.

In a Databricks Repo, you can programmatically create directories and create and append to files. This is useful for creating or modifying an environment specification file, writing output from notebooks, or writing output from execution of libraries, such as TensorBoard.

Using a Databricks cluster with Runtime 11.2 or above solved my issue.
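On Runtime 11.2+, a plain Python file copy works, since DBFS is exposed through the /dbfs FUSE mount and the repo directory becomes writable. A minimal sketch of the copy step (the paths and the helper name are placeholders, not from the original post):

```python
import shutil
from pathlib import Path


def copy_to_repo(src: str, dst_dir: str) -> str:
    """Copy a local file into a destination directory, creating it if needed.

    On Databricks (Runtime 11.2+), src would typically be a FUSE path such as
    /dbfs/FileStore/wheels/my_package-0.1-py3-none-any.whl, and dst_dir a repo
    path such as /Workspace/Repos/My_Repo/My_DBFS_File. Both are hypothetical
    examples here.
    """
    dest = Path(dst_dir)
    dest.mkdir(parents=True, exist_ok=True)  # repo subfolder may not exist yet
    target = dest / Path(src).name
    shutil.copy(src, target)  # plain POSIX copy works via the FUSE mounts
    return str(target)
```

After the copy, the file shows up in the repo folder and can be committed to Azure DevOps through the Repos UI. Inside a notebook, `dbutils.fs.cp` with a `dbfs:/` source and a `file:/Workspace/Repos/...` destination should achieve the same thing, though the snippet above sticks to the standard library.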
