Fail to create dataset using azure sdk python for azure data factory

I am trying to create a dataset in ADF using the Azure SDK for Python, but unfortunately I hit the error message below. I am not sure what is wrong with my code:

dsOut_name = 'POC_DatasetName'
ds_ls ="AzureBlobStorage"
output_blobpath = '/tempdir'
df_name = 'pipeline1'
dsOut_azure_blob = AzureBlobDataset(linked_service_name=ds_ls, folder_path=output_blobpath)
dsOut = adf_client.datasets.create_or_update(rg_name, df_name, dsOut_name, dsOut_azure_blob)
print_item(dsOut)

Error message:

SerializationError: Unable to build a model: Unable to deserialize to object: type, AttributeError: 'str' object has no attribute 'get', DeserializationError: Unable to deserialize to object: type, AttributeError: 'str' object has no attribute 'get'

Please help.

I can reproduce your issue. This line ds_ls = "AzureBlobStorage" is wrong; it should be ds_ls = LinkedServiceReference(reference_name=ls_name).
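To see why a plain string triggers that SerializationError, here is a rough, SDK-free sketch. The helper below is hypothetical and only mimics what the msrest serializer does internally: it expects linked_service_name to be a LinkedServiceReference model (or an equivalent mapping) and calls .get() on it, which a bare str does not support.

```python
# Hypothetical stand-in for the msrest serialization step; NOT the real
# SDK code, just an illustration of the failure mode in the question.
def serialize_linked_service_ref(ref):
    if isinstance(ref, dict):
        # A proper reference serializes to a JSON object with a "type" key
        return {"referenceName": ref.get("referenceName"),
                "type": "LinkedServiceReference"}
    # The serializer treats the value as a mapping, so a bare string
    # fails with: AttributeError: 'str' object has no attribute 'get'
    return {"referenceName": ref.get("referenceName")}

# Wrong (as in the question): a plain string
try:
    serialize_linked_service_ref("AzureBlobStorage")
except AttributeError as e:
    print(e)  # 'str' object has no attribute 'get'

# Right: roughly what LinkedServiceReference(reference_name=...) serializes to
print(serialize_linked_service_ref({"referenceName": "storageLinkedService"}))
```

This is why the fix is to pass a LinkedServiceReference model rather than the linked service's name as a string.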


You can refer to my complete working sample below.

Make sure your service principal has an RBAC role (e.g. Owner or Contributor) under Access control (IAM) on the data factory, and that you have completed all the prerequisites.

My package versions:

azure-mgmt-datafactory  0.6.0
azure-mgmt-resource  3.1.0
azure-common  1.1.23

Code:

from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import *


subscription_id = '<subscription-id>'
ls_name = 'storageLinkedService'
rg_name = '<group-name>'
df_name = '<datafactory-name>'

credentials = ServicePrincipalCredentials(client_id='<client id of the service principal>',
                                          secret='<secret of the service principal>', tenant='<tenant-id>')
resource_client = ResourceManagementClient(credentials, subscription_id)
adf_client = DataFactoryManagementClient(credentials, subscription_id)


storage_string = SecureString('DefaultEndpointsProtocol=https;AccountName=<storage account name>;AccountKey=<storage account key>')

ls_azure_storage = AzureStorageLinkedService(connection_string=storage_string)
ls = adf_client.linked_services.create_or_update(rg_name, df_name, ls_name, ls_azure_storage)

ds_ls = LinkedServiceReference(reference_name=ls_name)


# Create an Azure blob dataset (output)
dsOut_name = 'ds_out'
output_blobpath = '<container name>/<folder name>'
dsOut_azure_blob = AzureBlobDataset(linked_service_name=ds_ls, folder_path=output_blobpath)
dsOut = adf_client.datasets.create_or_update(rg_name, df_name, dsOut_name, dsOut_azure_blob)
print(dsOut)

