
How to log errors from parallel sources in an ADF data flow

I have to do some data engineering: read manifest.cdm.json files from a data lake, add a pipeline run ID column, and push the data to a SQL database.

I have one JSON list file that contains the parameters required to read each CDM JSON file in the data flow source.

Previous approach: I used a ForEach containing a single data flow activity, passed the parameters to it, and captured errors from that activity. But running a data flow inside a ForEach costs too much.

Current approach: I manually created one data flow with all the CDM files as sources. But here I'm not able to capture errors: if any single source fails, the whole data flow activity fails, and if I select "skip error" on the data flow activity, I don't get any error information at all.

So what should the approach be to capture the errors with the current setup?

azure-data-flow

You can capture the error message using a Set Variable activity in Azure Data Factory.

Use the expression below in the Set Variable activity to capture the error message:

@activity('Data Flow1').Error.message
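Wired onto the data flow's Failed dependency path, the Set Variable step looks roughly like this in pipeline JSON. This is a minimal sketch: the activity name `Data Flow1`, the step name, and the String pipeline variable `errorMessage` are placeholders you would adapt to your pipeline.

```json
{
  "name": "Set error message",
  "type": "SetVariable",
  "dependsOn": [
    {
      "activity": "Data Flow1",
      "dependencyConditions": [ "Failed" ]
    }
  ],
  "typeProperties": {
    "variableName": "errorMessage",
    "value": {
      "value": "@activity('Data Flow1').Error.message",
      "type": "Expression"
    }
  }
}
```

Because the dependency condition is `Failed`, this step only runs when the data flow activity errors out, so the variable holds the failure message for downstream activities.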


Later you can store the error message in blob storage for future reference using a Copy activity. In the example below, the error message is saved to a .csv file using a DelimitedText dataset.
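One way to get the captured message into the CSV (a sketch of a possible setup, not a confirmed detail of the screenshot) is to add it as an extra column on the Copy activity's source with `additionalColumns`. This assumes an upstream Set Variable activity called `Set error message` has already stored the message in a String variable `errorMessage`, and that the datasets `DummySourceCsv` and `ErrorLogCsv` are placeholder DelimitedText datasets you define yourself:

```json
{
  "name": "Copy error to blob",
  "type": "Copy",
  "dependsOn": [
    {
      "activity": "Set error message",
      "dependencyConditions": [ "Succeeded" ]
    }
  ],
  "typeProperties": {
    "source": {
      "type": "DelimitedTextSource",
      "additionalColumns": [
        {
          "name": "ErrorMessage",
          "value": {
            "value": "@variables('errorMessage')",
            "type": "Expression"
          }
        }
      ]
    },
    "sink": { "type": "DelimitedTextSink" }
  },
  "inputs": [ { "referenceName": "DummySourceCsv", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "ErrorLogCsv", "type": "DatasetReference" } ]
}
```

The source dataset here is just a small placeholder file; the point is that `additionalColumns` appends the variable's value as a column, so the error message lands in the .csv alongside each copied row.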

