
How can we save or upload a .py file to DBFS/FileStore?

We have a few .py files on my local machine that need to be stored/saved on the FileStore path on DBFS. How can I achieve this?

I tried the copy actions of the dbutils.fs module.

I tried the code below, but it did not work; I know something is not right with my source path. Or is there a better way of doing this? Please advise.

'''
dbUtils.fs.cp ("c:\\file.py", "dbfs/filestore/file.py")
'''

It sounds like you want to copy a file from your local machine to the DBFS path on the Azure Databricks servers. However, because the Azure Databricks notebook is a browser-based interactive interface, code running on the cluster cannot directly operate on files on your local machine.

Here are some solutions you can try.

  1. As @Jon said in the comment, you can follow the official document Databricks CLI to install the Databricks CLI on your local machine via the Python tool command pip install databricks-cli, and then copy a file to DBFS (see the CLI sketch after this list).

  2. Follow the official document Accessing Data to import data by dropping files into, or browsing to files in, the Import & Explore Data box on the landing page; using the CLI is still recommended, as the figure below shows.

    [Screenshot: the Import & Explore Data box on the Databricks landing page]

  3. Upload your files to Azure Blob Storage, then follow the official document Data sources / Azure Blob Storage to perform operations on them, including dbutils.fs.cp (see the notebook sketch after this list).
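
For option 1, a minimal sketch of the CLI workflow. The workspace URL and personal access token requested by databricks configure are yours to supply, and dbfs:/FileStore/file.py is just an illustrative target path:

'''
# Install the Databricks CLI on the local machine
pip install databricks-cli

# Configure it with your workspace URL and a personal access token
databricks configure --token

# Copy the local file to the FileStore path on DBFS
databricks fs cp ./file.py dbfs:/FileStore/file.py
'''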
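For option 3, a minimal notebook-side sketch, assuming the file has already been uploaded to a Blob Storage container. The storage account name, container name, and access key below are hypothetical placeholders:

'''
# Runs inside a Databricks notebook, where spark and dbutils are predefined.
# "mystorageaccount" and "mycontainer" are hypothetical placeholder names.
spark.conf.set(
    "fs.azure.account.key.mystorageaccount.blob.core.windows.net",
    "<storage-account-access-key>"  # better: dbutils.secrets.get(scope=..., key=...)
)

# Copy the uploaded .py file from Blob Storage into the FileStore path on DBFS
dbutils.fs.cp(
    "wasbs://mycontainer@mystorageaccount.blob.core.windows.net/file.py",
    "dbfs:/FileStore/file.py"
)
'''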

Hope it helps.

