
Azure ADF V2 - Activity on completion

I'm working on a small project to track ETL logs. I've created a stored procedure with parameters and a custom SQL table to store the ETL logs.

Inside the ADF pipeline I have multiple activities. At the end I use a Stored Procedure activity whose parameters are mapped to ADF system variables (pipeline name, error details, etc.) to write a row to the SQL log table.
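For reference, such a Stored Procedure activity might be defined roughly like this in the pipeline JSON; the procedure, parameter, and linked-service names here are illustrative, not taken from the actual pipeline:

```json
{
  "name": "ETLLog_StoredProcedure",
  "type": "SqlServerStoredProcedure",
  "linkedServiceName": {
    "referenceName": "AzureSqlLogDb",
    "type": "LinkedServiceReference"
  },
  "typeProperties": {
    "storedProcedureName": "[dbo].[usp_LogEtlRun]",
    "storedProcedureParameters": {
      "PipelineName": { "value": "@pipeline().Pipeline", "type": "String" },
      "RunId":        { "value": "@pipeline().RunId",    "type": "String" },
      "LogTime":      { "value": "@utcnow()",            "type": "DateTime" }
    }
  }
}
```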

Issue: whenever an activity in the middle fails, the pipeline stops and never reaches the Stored Procedure activity. For example, say I have Copy1, Copy2, Copy3 and finally ETLLog_StoredProcedure. If Copy2 fails, the pipeline run stops at Copy2 and the stored procedure activity is not run.

I have connected all of the Copy activities to ETLLog_StoredProcedure using on-completion dependencies. Take a look at the picture below.

Expectation: I need the stored procedure activity to be called whether the pipeline fails or succeeds, so that I can log the status of the pipeline.

[Image: sample ADF ETLLog pipeline]

Data Factory dependencies are evaluated as an AND condition. This means the stored procedure would run only once ALL three activities have "completed" (with success or failure). But in your scenario the second activity fails, so the third one never runs (it does not even fail), and that's why the Stored Procedure activity is not running.

You can achieve what you are looking for with this approach, changing the stored procedure's parameters depending on where the failure happened (or recording a success in the last one), for example:
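Concretely, the wiring in the picture could be sketched like this in the pipeline JSON: each Copy activity gets its own Stored Procedure activity attached with the "Failed" dependency condition (plus one attached to the last Copy with "Succeeded"), and each call passes a different status parameter. Activity, procedure, and linked-service names here are illustrative:

```json
{
  "activities": [
    {
      "name": "LogCopy2Failure",
      "type": "SqlServerStoredProcedure",
      "dependsOn": [
        { "activity": "Copy2", "dependencyConditions": [ "Failed" ] }
      ],
      "linkedServiceName": {
        "referenceName": "AzureSqlLogDb",
        "type": "LinkedServiceReference"
      },
      "typeProperties": {
        "storedProcedureName": "[dbo].[usp_LogEtlRun]",
        "storedProcedureParameters": {
          "Status":       { "value": "Failed",                           "type": "String" },
          "ErrorMessage": { "value": "@activity('Copy2').Error.Message", "type": "String" }
        }
      }
    }
  ]
}
```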

[Image: logging errors in ADF]

There are other ways to achieve this, but they require a bit more familiarity with ADF variables and functions; this one is, in my opinion, the simplest.

Hope this helped!

I added a picture for better understanding.

[Image: ADF pipeline run]

To have only one Stored Procedure call in the pipeline, you can just add the dependency condition "Skipped".

So in general the activity "Copy data3" has two options that fulfil the condition to execute "Stored procedure1": Completed OR Skipped. Since "Copy data1" and "Copy data2" both completed and "Copy data3" was skipped, "Stored procedure1" is executed.
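In pipeline JSON terms, the dependency on "Stored procedure1" might look like the sketch below. Conditions listed in the same dependencyConditions array are OR'd together, while separate entries in dependsOn are AND'd (which is the behaviour described above). Activity names follow the answer; the rest of the activity definition is omitted:

```json
{
  "name": "Stored procedure1",
  "type": "SqlServerStoredProcedure",
  "dependsOn": [
    {
      "activity": "Copy data3",
      "dependencyConditions": [ "Completed", "Skipped" ]
    }
  ]
}
```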
