Attribute error while creating scope to access Azure Datalake Gen2 from Databricks

I was trying to set this up using secret scopes and I am having a few issues. Any help would be appreciated.

I ran the commands below in the Databricks CLI:

databricks secrets create-scope --scope dnb-dlg2-dbrcks-scp-stg
databricks secrets put --scope dnb-dlg2-dbrcks-scp-stg --key SPID --string-value "XXXXXXXXXXXXXXXXXX"
databricks secrets put --scope dnb-dlg2-dbrcks-scp-stg --key SPKey --string-value "XXXXXXXXXXXXXXX"
databricks secrets put --scope dnb-dlg2-dbrcks-scp-stg --key DirectoryID --string-value "XXXXXXXXXX"
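
As a quick sanity check, the key names stored in the scope can be listed from a notebook (a minimal sketch; dbutils.secrets.list exposes key names only, never secret values):

# List the key names registered in the scope to confirm the puts succeeded
for s in dbutils.secrets.list("dnb-dlg2-dbrcks-scp-stg"):
    print(s.key)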

The scope was created successfully. Then I tried to run the following in my notebook:

#Gather Relevant Keys from our scope

ServicePrincipalId=dbutils.secret.get(scope="dnb-dlg2-dbrcks-scp-stg",key="SPID")
ServicePrincipalKey=dbutils.secret.get(scope="dnb-dlg2-dbrcks-scp-stg",key="SPKey")
DirectoryID=dbutils.secret.get(scope="dnb-dlg2-dbrcks-scp-stg",key="DirectoryID")

#Combine DirectoryID into full string
Directory="https://login.microsoftonline.com/{}/oauth2/token".format(DirectoryID)

#Create configurations for our connections
configs = {"fs.azure.account.auth.type": "OAuth",
           "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
           "fs.azure.account.oauth2.client.id" : ServicePrincipalId,
           "fs.azure.account.oauth2.client.secret":  ServicePrincipalKey,
           "fs.azure.account.oauth2.client.endpoint": Directory}
# "fs.azure.account.oauth2.client.secret" -> dbutils.secrets.get("dnb-dbrk-scrt-scp-stg", key = "dnb-data-bricks-kv-stg"),

# Mount the Data Lake onto DBFS at the /mnt/ location

dbutils.fs.mount(
  source = "abfss://datastore@dbstgstoraccgen2.dfs.core.windows.net/",
  mount_point = "/mnt/datastore5",
  extra_configs = configs)

I get an error at this point. The error details are below:

ERROR DETAILS

    AttributeError: 
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<command-4345234368633882> in <module>
----> 1 dbutils.secret.get(scope="dnb-dlg2-dbrcks-scp-stg",key="SPID")

/local_disk0/tmp/1575916741583-0/dbutils.py in __getattr__(self, item)
    482             return self.credentials
    483 
--> 484         raise AttributeError
    485 
    486     def __repr__(self):

AttributeError: 

Small typo in your code: "secret" should be "secrets".


Error:

ServicePrincipalId=dbutils.secret.get(scope="dnb-dlg2-dbrcks-scp-stg",key="SPID")
ServicePrincipalKey=dbutils.secret.get(scope="dnb-dlg2-dbrcks-scp-stg",key="SPKey")
DirectoryID=dbutils.secret.get(scope="dnb-dlg2-dbrcks-scp-stg",key="DirectoryID")

Replace "secret.get" with "secrets.get":

ServicePrincipalId=dbutils.secrets.get(scope="dnb-dlg2-dbrcks-scp-stg",key="SPID")
ServicePrincipalKey=dbutils.secrets.get(scope="dnb-dlg2-dbrcks-scp-stg",key="SPKey")
DirectoryID=dbutils.secrets.get(scope="dnb-dlg2-dbrcks-scp-stg",key="DirectoryID")
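
For completeness, a minimal sketch of the remaining mount steps with the corrected API, assuming the same configs dict, container, storage account, and mount point as in the question; the dbutils.fs.mounts() guard simply makes the cell safe to re-run:

# Mount only if /mnt/datastore5 is not already mounted, so re-running the cell doesn't fail
if not any(m.mountPoint == "/mnt/datastore5" for m in dbutils.fs.mounts()):
    dbutils.fs.mount(
      source = "abfss://datastore@dbstgstoraccgen2.dfs.core.windows.net/",
      mount_point = "/mnt/datastore5",
      extra_configs = configs)

# Quick check that the mount is readable
display(dbutils.fs.ls("/mnt/datastore5"))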


Hope this helps. Do let us know if you have any further queries.


Do click on "Mark as Answer" and upvote the post that helps you, as this can be beneficial to other community members.
