
Azure ADF DataFlow failed while writing rows

I have a data flow in Azure Data Factory that converts Parquet files into CSV. It works when I have 10 files of 10 KB each, but with 3 files of 22 KB each it fails with the error 'failed while writing rows'.

Does anyone know a possible solution?

The information provided is not sufficient to pinpoint the cause. You can try increasing the core count of the Data Flow compute, and the blog post below may also help:

Click Here

The Parquet file standard does not allow certain special characters in column names. Use a select transformation or the sink transformation's mapping to remap these columns to different names. You can also do this generically with rule-based (pattern) mapping to rename all such columns at once.
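Outside ADF, the same remapping idea can be sketched in plain Python. The character set below is an assumption based on the restriction that Spark-based engines commonly enforce on Parquet column names (space, `,;{}()\n\t=`); adjust it to whatever characters your pipeline actually rejects:

```python
import re

# Assumed set of characters rejected in Parquet column names by
# Spark-based engines: space , ; { } ( ) newline tab =
INVALID = re.compile(r"[ ,;{}()\n\t=]")

def remap_columns(names):
    """Return an {old_name: safe_name} mapping, replacing each
    disallowed character with an underscore."""
    return {name: INVALID.sub("_", name) for name in names}

mapping = remap_columns(["order id", "price(usd)", "qty"])
print(mapping)  # {'order id': 'order_id', 'price(usd)': 'price_usd_', 'qty': 'qty'}
```

In an ADF data flow you would express the same rule declaratively in a select transformation's rule-based mapping, rather than in code; the snippet is just a quick way to preview what the renamed columns would look like.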
