Access Azure Data Lake Storage Gen2 using the account key
I'm trying to retrieve all paths to directories in Azure Data Lake Storage Gen2 using the method specified in the docs: https://docs.databricks.com/external-data/azure-storage.html. In particular, I'm trying to access it using only the storage account key (not via Azure service principals or SAS tokens).
The doc says we just need to set this conf:
spark.conf.set(
"fs.azure.account.key.<storage-account>.dfs.core.windows.net",
dbutils.secrets.get(scope="<scope>", key="<storage-account-access-key>"))
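For context, the call that then fails is a plain directory listing (the stack trace below shows FSUtils.ls); roughly what I'm running, with placeholder container/account names:

dbutils.fs.ls("abfss://<my-container>@<storage-account>.dfs.core.windows.net/")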
But nonetheless I'm getting a 403 permission error:
ExecutionError: An error occurred while calling z:com.databricks.backend.daemon.dbutils.FSUtils.ls.
: Operation failed: "This request is not authorized to perform this operation using this permission.", 403, GET, https://<storage-account>.dfs.core.windows.net/<my-container>?upn=false&resource=filesystem&maxResults=5000&timeout=90&recursive=false, AuthorizationPermissionMismatch, "This request is not authorized to perform this operation using this permission. RequestId:ef2753bb-501f-00bf-3c6c-26d7f4000000 Time:2023-01-12T09:56:54.1924327Z"
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:248)
How come I'm still getting the error if I only use the storage key? I'm a bit confused because with a service principal I have to grant the client_id on the storage, but in this case what do I have to grant? I'm just using the key..
Any help is appreciated. Thanks.
I tried to reproduce the same in my environment and got the below results.
Go to the storage account -> IAM -> + Add -> add the Storage Blob Data Contributor role.
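If you'd rather script that role assignment than click through the portal, here is a rough sketch with the azure-mgmt-authorization package (the subscription ID, resource group, and principal object ID are placeholders you must fill in; treat the exact model names as an assumption, since they vary a bit across SDK versions):

import uuid
from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

subscription_id = "<subscription-id>"  # placeholder
# Scope: the storage account you are granting access on
scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage_account>"
)
# Well-known ID of the built-in "Storage Blob Data Contributor" role
role_definition_id = (
    f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization"
    "/roleDefinitions/ba92f5b4-2d11-453d-a403-e96b0029c9fe"
)

client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)
client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # the assignment name must be a fresh GUID
    RoleAssignmentCreateParameters(
        role_definition_id=role_definition_id,
        principal_id="<object-id-of-user-or-service-principal>",  # placeholder
    ),
)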
Configure the storage account in one of two ways.
With Key Vault:
spark.conf.set( "fs.azure.account.key.<storage_account>.dfs.core.windows.net", dbutils.secrets.get(scope="<scope_name>", key="<Access_key>"))
Without Key Vault:
spark.conf.set("fs.azure.account.key.<storage_account>.dfs.core.windows.net","Access_key")