Error code DFExecutorUserError / Problem with loading a CSV file from Blob storage to an Azure SQL Database
I am having trouble importing a CSV file into an Azure SQL Database; it gives me the following error:

Error code: DFExecutorUserError; Failure type: User configuration issue; Details: Job failed due to reason: The given value of type NVARCHAR(360) from the data source cannot be converted to type nvarchar(50) of the specified target column. Source: Pipeline pipeline1; Data flow dataflow1; Monitor: Data flow activity Data flow1
The data flow consists of a source, a derived column (where I convert the data types of a few columns from string to int and date), and a sink. One of the columns (Message) contains a lot of text in every row (mostly e-mails from customers), and for that column I have set varchar(max) in the database.

Thanks in advance for the replies.
I tried to reproduce the issue in my system and got the same error as you.

The main cause of the error is that when we move data from Blob storage to SQL, a target table created with a column size smaller than the incoming data triggers this issue, since a value cannot be inserted into a column beyond its defined size.

To resolve this, either set the column size to max when creating the table (if you don't know the required size), or add a pre-copy script that alters the column size to max, and then run your pipeline:
ALTER TABLE table_name ALTER COLUMN column_name varchar(max)
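Alternatively, if you create the target table up front, you can declare the wide column as max from the start. As a sketch (the table and column names here are hypothetical, chosen to match the question's Message column):

```sql
-- Hypothetical target table: the Message column holds long e-mail
-- bodies from the CSV, so it is declared NVARCHAR(MAX) rather than
-- a fixed size such as nvarchar(50), which caused the sink error.
CREATE TABLE dbo.CustomerMessages (
    Id           INT           NOT NULL PRIMARY KEY,
    ReceivedDate DATE          NOT NULL,
    Message      NVARCHAR(MAX) NOT NULL
);
```

With the column already sized to max, no pre-copy ALTER script is needed.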
After that, the pipeline executed successfully.