
Azure Data Factory Pipeline Logs

Where are the data logs of Azure Pipeline v2 stored? I would like to retrieve data on failed pipelines for a specific date (I don't want to use the Azure portal to view this data). Is there any table/view in a database that holds such logs?

As far as I know, to get the diagnostic logs you can use Azure Monitor and Operations Management Suite (OMS) to monitor these pipelines visually.

By Azure Pipeline v2, you presumably mean Azure Data Factory v2. See the documentation: Alert and monitor data factories using Azure Monitor.

Diagnostic logs:

  • Save them to a Storage Account for auditing or manual inspection. You can specify the retention time (in days) using the diagnostic settings.
  • Stream them to Event Hubs for ingestion by a third-party service or a custom analytics solution such as Power BI.
  • Analyze them with Log Analytics (a sketch of wiring this up follows below).
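As a minimal sketch of the Log Analytics option, the diagnostic setting can be created programmatically with the azure-mgmt-monitor package. The resource IDs and the setting name below are placeholders, and the exact model import paths can vary between SDK versions, so treat this as an illustration rather than a drop-in script:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import DiagnosticSettingsResource, LogSettings

# Placeholder identifiers -- replace with your own.
subscription_id = "<subscription-id>"
factory_resource_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.DataFactory/factories/<factory-name>"
)
workspace_resource_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>"
)

client = MonitorManagementClient(DefaultAzureCredential(), subscription_id)

# Send pipeline, activity and trigger run logs to the Log Analytics workspace.
# log_analytics_destination_type="Dedicated" requests the resource-specific
# tables (ADFPipelineRun, ADFActivityRun, ...) instead of AzureDiagnostics.
client.diagnostic_settings.create_or_update(
    resource_uri=factory_resource_id,
    name="adf-to-log-analytics",  # arbitrary setting name
    parameters=DiagnosticSettingsResource(
        workspace_id=workspace_resource_id,
        log_analytics_destination_type="Dedicated",
        logs=[
            LogSettings(category="PipelineRuns", enabled=True),
            LogSettings(category="ActivityRuns", enabled=True),
            LogSettings(category="TriggerRuns", enabled=True),
        ],
    ),
)
```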

The pipeline run and activity run data is kept by the Azure Data Factory service for 45 days. If you want to retain that metadata for longer, you can use the Azure Data Factory SDK to extract the information you need and save it wherever you want.
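Since the question asks specifically for failed pipelines on a given date without the portal, here is a sketch using the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory name and the date range are placeholders/examples:

```python
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    RunFilterParameters,
    RunQueryFilter,
    RunQueryFilterOperand,
    RunQueryFilterOperator,
)

# Placeholder identifiers -- replace with your own.
subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "<factory-name>"

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Example date range: one specific day (UTC).
day_start = datetime(2024, 1, 15, tzinfo=timezone.utc)
day_end = datetime(2024, 1, 16, tzinfo=timezone.utc)

# Query pipeline runs in that window, keeping only the failed ones.
filters = RunFilterParameters(
    last_updated_after=day_start,
    last_updated_before=day_end,
    filters=[
        RunQueryFilter(
            operand=RunQueryFilterOperand.STATUS,
            operator=RunQueryFilterOperator.EQUALS,
            values=["Failed"],
        )
    ],
)

runs = client.pipeline_runs.query_by_factory(resource_group, factory_name, filters)
for run in runs.value:
    print(run.pipeline_name, run.run_id, run.status, run.run_end, run.message)
```

The same client exposes activity-level runs (client.activity_runs.query_by_pipeline_run) if you need the per-activity error details for a given run ID.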

The recommended approach for long-term analysis, as well as for limiting access to a production data factory, is to configure the logs to be sent to Log Analytics. Be sure to enable the dedicated (resource-specific) logging tables, as this helps on the backend in terms of organizing your logs.

From there you can also set up alerts and access groups driven by Log Analytics queries for better monitoring.
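Once the logs land in Log Analytics, a failed-pipeline report for a specific day can be pulled without the portal via the azure-monitor-query package. This is a sketch assuming the dedicated ADFPipelineRun table is in use; the workspace ID, date range and projected column names are placeholders/examples:

```python
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Placeholder workspace ID -- the Log Analytics workspace GUID.
workspace_id = "<log-analytics-workspace-id>"

client = LogsQueryClient(DefaultAzureCredential())

# KQL against the resource-specific ADFPipelineRun table (requires the
# "Dedicated" destination type in the diagnostic setting).
query = """
ADFPipelineRun
| where Status == "Failed"
| project TimeGenerated, PipelineName, RunId, Status, FailureType
| order by TimeGenerated desc
"""

# Example timespan: one specific day (UTC).
response = client.query_workspace(
    workspace_id=workspace_id,
    query=query,
    timespan=(
        datetime(2024, 1, 15, tzinfo=timezone.utc),
        datetime(2024, 1, 16, tzinfo=timezone.utc),
    ),
)

for table in response.tables:
    for row in table.rows:
        print(dict(zip(table.columns, row)))
```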
