
Mounting Azure Data Lake Gen2 with Databricks /mnt

I have mounted my Azure Data Lake Gen2 storage with Databricks, and the mount completed successfully. The problem is that I then get this error: FileNotFoundException: / is not found. This is the Scala command I used to mount:

dbutils.fs.mount(
  source = "wasbs://XXXXXXXX@YYYYYYYYYYYY.blob.core.windows.net", 
  mountPoint = "/mnt/",
  extraConfigs = Map("fs.azure.sas.XXXXXXXX.YYYYYYYYYYYY.blob.core.windows.net" -> dbutils.secrets.get(scope = "SCOPENAME", key = "SCOPEKEY")))
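A likely cause of FileNotFoundException: / is not found is the mount point "/mnt/" itself: that is the root of the DBFS mount directory, and mounts normally go under a named subdirectory of it. A minimal sketch of the same call with a named mount point; the name "mydata" is a hypothetical placeholder, and the container/account placeholders are kept from the question. Note also that wasbs:// targets the Blob endpoint; for ADLS Gen2 the abfss://...dfs.core.windows.net endpoint is the usual choice.

```scala
// Sketch only: runs inside a Databricks notebook, not as a standalone program.
// "mydata" is a hypothetical mount-point name; the container/account
// placeholders (XXXXXXXX / YYYYYYYYYYYY) are copied from the question.
dbutils.fs.mount(
  source = "wasbs://XXXXXXXX@YYYYYYYYYYYY.blob.core.windows.net",
  mountPoint = "/mnt/mydata",
  extraConfigs = Map(
    "fs.azure.sas.XXXXXXXX.YYYYYYYYYYYY.blob.core.windows.net" ->
      dbutils.secrets.get(scope = "SCOPENAME", key = "SCOPEKEY")))
```

After a successful mount, files should then be reachable under dbfs:/mnt/mydata rather than at the bare /mnt root.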

I have created the secret scope in Databricks and linked it with the Blob storage in Azure.

I am also executing this command:

dbutils.secrets.get(scope = "SCOPENAME", key = "SCOPEKEY") 

and it returns the secret value, so the secret is accessible: String = [REDACTED]

What did I do wrong? Can anyone help?

Please check the location using the following commands. For understanding DBFS locations, see the DBFS documentation link.

%fs ls /mnt 

or

%sh ls /dbfs/tmp/

or

dbutils.fs.ls ("/mnt/")
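Beyond listing paths, dbutils can also show exactly what is mounted where, and unmount a path before retrying a mount at the same location. A sketch, assuming a Databricks notebook context; "/mnt/mydata" is a hypothetical mount point:

```scala
// Sketch only: Databricks notebook context assumed.
// Print every current mount point and the storage URL it maps to.
dbutils.fs.mounts().foreach(m => println(s"${m.mountPoint} -> ${m.source}"))

// Unmount before remounting at the same path ("/mnt/mydata" is hypothetical).
dbutils.fs.unmount("/mnt/mydata")
```

If the faulty mount at "/mnt/" shows up in this list, unmounting it and remounting under a named subdirectory is a reasonable next step.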


