

Error code DFExecutorUserError / Problem with loading a CSV file from Blob storage to an Azure SQL Database

I am having trouble importing a CSV file into an Azure SQL Database; it gives me the following error:

Error code: DFExecutorUserError; Failure type: User configuration issue; Details: Job failed due to reason: The given value of type NVARCHAR(360) from the data source cannot be converted to type nvarchar(50) of the specified target column. Source: Pipeline pipeline1; Data flow dataflow1; Monitor: Data flow activity Data flow1

The data flow consists of a source, a derived column transformation (where I convert the data types of a few columns from string to int and date), and a sink.
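For reference, string-to-int and string-to-date conversions in a derived column transformation use mapping data flow expressions along these lines (the column names and date format here are hypothetical, not taken from the question):

```
toInteger(Quantity)
toDate(OrderDate, 'yyyy-MM-dd')
```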


One of the columns (Message) contains a lot of text on every row (mostly e-mails from customers), and for that column I have set the type to varchar(max) in the database.

Thanks in advance for the replies.

I tried to reproduce a similar issue in my environment and got the same error as you.


The main cause of the error: when we move data from Blob storage to SQL and the target table was already created with a small column size, this issue is triggered, because we cannot insert a value into a column beyond its defined size.
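As a quick illustration of the cause (the table and column names are made up for this sketch), inserting a value longer than the declared column size fails in T-SQL with a truncation error:

```sql
-- Column declared too small for the incoming data:
CREATE TABLE dbo.Demo (Message nvarchar(50));

-- A 360-character value, like the NVARCHAR(360) in the error message,
-- cannot fit into nvarchar(50) and the insert fails with a truncation error:
INSERT INTO dbo.Demo (Message) VALUES (REPLICATE(N'x', 360));
```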

To resolve this, either set the column size to max when you create the table (if you don't know how large the values will be), or add a pre-copy script that alters the column size to max, and then run your pipeline:

ALTER TABLE table_name ALTER COLUMN column_name varchar(max)
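If you would rather size the columns to the actual data instead of using max, a small script can report the longest value per CSV column before you create the table. This is a sketch; the file path, header row, and UTF-8 encoding are assumptions:

```python
import csv

def max_field_lengths(path):
    """Return the longest value length seen per column in a CSV file."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        # Start every column at length 0, then track the max as we scan rows.
        lengths = {name: 0 for name in reader.fieldnames}
        for row in reader:
            for name, value in row.items():
                if value is not None and len(value) > lengths[name]:
                    lengths[name] = len(value)
    return lengths
```

The reported lengths can then inform the nvarchar sizes in your CREATE TABLE statement, leaving some headroom for future rows.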


Pipeline executed successfully.


