
Access Azure Data Lake Storage Gen2 using the account key

I'm trying to retrieve all paths to directories in Azure Data Lake Storage Gen2 using the method specified in the doc: https://docs.databricks.com/external-data/azure-storage.html . In particular, I'm trying to access it using only the storage account key (not via Azure service principals or SAS tokens).

The doc says we just need to set this conf:

spark.conf.set(
    "fs.azure.account.key.<storage-account>.dfs.core.windows.net",
    dbutils.secrets.get(scope="<scope>", key="<storage-account-access-key>"))
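Once the conf is set, retrieving all directory paths requires recursing over `dbutils.fs.ls`. As a minimal sketch, the traversal logic below is written against a hypothetical `list_fn` callable (standing in for a wrapper around `dbutils.fs.ls`, which is only available on a Databricks cluster), so the recursion itself is self-contained:

```python
def collect_dirs(root, list_fn):
    """Return all directory paths under `root`, depth-first.

    `list_fn(path)` is assumed to yield (path, is_dir) tuples for the
    immediate children of `path` -- a stand-in for dbutils.fs.ls.
    """
    dirs = []
    for path, is_dir in list_fn(root):
        if is_dir:
            dirs.append(path)
            # Recurse into the subdirectory to pick up nested paths.
            dirs.extend(collect_dirs(path, list_fn))
    return dirs

# On Databricks, a wrapper such as the following (hypothetical) lambda
# would adapt dbutils.fs.ls to this interface:
#   list_fn = lambda p: [(f.path, f.isDir()) for f in dbutils.fs.ls(p)]
#   collect_dirs("abfss://<my-container>@<storage-account>.dfs.core.windows.net/", list_fn)
```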

But nonetheless I'm getting a 403 permission error:

ExecutionError: An error occurred while calling z:com.databricks.backend.daemon.dbutils.FSUtils.ls.
: Operation failed: "This request is not authorized to perform this operation using this permission.", 403, GET, https://<storage-account>.dfs.core.windows.net/<my-container>?upn=false&resource=filesystem&maxResults=5000&timeout=90&recursive=false, AuthorizationPermissionMismatch, "This request is not authorized to perform this operation using this permission. RequestId:ef2753bb-501f-00bf-3c6c-26d7f4000000 Time:2023-01-12T09:56:54.1924327Z"
    at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:248)

How come I'm still getting the error if I only use the storage key? I'm a bit confused, because with a service principal I have to grant the client_id access on the storage account, but in this case what do I have to grant? I'm just using the key.

Any help is appreciated. Thanks.

I tried to reproduce the same in my environment and got the below results.

Go to the storage account -> IAM -> +Add -> assign the Storage Blob Data Contributor role.


Configure the storage account in either of two ways.

With Key Vault:

spark.conf.set(
    "fs.azure.account.key.<storage_account>.dfs.core.windows.net",
    dbutils.secrets.get(scope="<scope_name>", key="<Access_key>"))

Without Key Vault:

spark.conf.set("fs.azure.account.key.<storage_account>.dfs.core.windows.net", "<Access_key>")
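In both variants, the storage account name embedded in the conf key must match the one in the `abfss://` URI you later list, or authentication will fail. As a small sketch (the helper names are my own, not a Databricks API), these placeholders can be built consistently from one `storage_account` value:

```python
def abfss_conf_key(storage_account: str) -> str:
    """Build the Spark conf key for account-key auth to ADLS Gen2."""
    return f"fs.azure.account.key.{storage_account}.dfs.core.windows.net"

def abfss_path(container: str, storage_account: str, relative: str = "") -> str:
    """Build an abfss:// URI pointing into the same storage account."""
    return f"abfss://{container}@{storage_account}.dfs.core.windows.net/{relative}"

# On Databricks (sketch, assuming `spark` and `dbutils` are in scope):
#   spark.conf.set(abfss_conf_key("<storage_account>"), "<Access_key>")
#   dbutils.fs.ls(abfss_path("<my-container>", "<storage_account>"))
```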


For more information, refer to this SO thread.

