
Error code DFExecutorUserError / Problem loading a CSV file from Blob storage into an Azure SQL Database

I am having trouble importing a CSV file into an Azure SQL Database; it gives me the following error:

Error code: DFExecutorUserError; Failure type: User configuration issue; Details: Job failed due to reason: The given value of type NVARCHAR(360) from the data source cannot be converted to type nvarchar(50) of the specified target column. Source: Pipeline pipeline1; Data flow dataflow1; Monitor: Data flow activity Data flow1

The data flow consists of a source, a derived column transformation (where I convert the data types of a few columns from string to int and date), and a sink.


One of the columns (Message) contains a lot of text in every row (mostly e-mails from customers), and for that column I have set varchar(max) in the database.

Thanks in advance for the replies.

I tried to reproduce the issue in my environment and got the same error as you.


The main cause of this error is moving data from Blob storage into a SQL table whose target column was created with a size that is too small: a value cannot be inserted into a column if it exceeds the column's declared size.
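As a minimal illustration of that size mismatch (the table and column names below are hypothetical, not taken from the post), inserting a value longer than the declared column size fails the same way the sink does:

-- Hypothetical target table: the text column was created as nvarchar(50).
CREATE TABLE dbo.CustomerFeedback (
    Id      INT          NOT NULL,
    Message NVARCHAR(50) NOT NULL
);

-- A 360-character value (like the NVARCHAR(360) data coming from the CSV)
-- does not fit, so SQL Server rejects the insert with a truncation error.
INSERT INTO dbo.CustomerFeedback (Id, Message)
VALUES (1, REPLICATE(N'x', 360));

The same constraint applies when the data flow sink writes rows, which is why the pipeline surfaces it as a conversion error.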

To resolve this, either set the column size to max when you create the table (if you do not know how large the values will be), or add a pre-copy script that alters the column size to max, and then run your pipeline:

ALTER TABLE table_name ALTER COLUMN column_name varchar(max)
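For example, applied to the hypothetical table above (names are illustrative only), widening the column before the copy, either manually or in the sink's pre-copy / pre-SQL script, lets the longer values through:

-- Widen the target column so values of any length fit.
ALTER TABLE dbo.CustomerFeedback
ALTER COLUMN Message NVARCHAR(MAX) NOT NULL;

-- The 360-character value now inserts without error.
INSERT INTO dbo.CustomerFeedback (Id, Message)
VALUES (1, REPLICATE(N'x', 360));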


The pipeline then executed successfully.
