Writing Data to Azure Blob Storage from Azure Databricks
I was able to mount an Azure Blob container to my Databricks DBFS and read the data. When writing, I can see the files in the mount point from within Databricks, but they do not appear in the blob storage. Can someone help?
Are you saving the data frame?
df.write \
    .option("header", "true") \
    .format("com.databricks.spark.csv") \
    .save("/mnt/result/someData.csv")
Chances are that your path is incorrect. Check the mounted path with

dbutils.fs.mounts()

and ensure it is in your saving path. Also check that your saving path starts with dbfs:/ and not /dbfs/. Don't hesitate to share your script.
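To illustrate the dbfs:/ vs. /dbfs/ distinction, here is a small hypothetical Python helper (the name `to_spark_path` is made up for this sketch and is not part of Databricks): /dbfs/ is the local FUSE-mount view of DBFS, while Spark writers like df.write.save() expect the dbfs:/ URI form.

```python
def to_spark_path(path: str) -> str:
    """Convert a local FUSE-style /dbfs/ path into the dbfs:/ URI form
    that Spark's DataFrameWriter expects. Hypothetical helper, for
    illustration only -- Databricks does not ship this function."""
    if path.startswith("/dbfs/"):
        return "dbfs:/" + path[len("/dbfs/"):]
    return path  # already dbfs:/ (or some other scheme); leave unchanged

print(to_spark_path("/dbfs/mnt/result/someData.csv"))  # dbfs:/mnt/result/someData.csv
print(to_spark_path("dbfs:/mnt/result/someData.csv"))  # unchanged
```

If your save path was built by joining strings onto "/dbfs/...", a helper like this (or simply writing the dbfs:/ form by hand) avoids Spark silently writing somewhere other than the mounted container.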