
Dataflow jobs failing and showing no logs

I created pipelines in Dataflow using the standard JDBC to BigQuery template, and a few jobs are unexpectedly failing without showing any logs.

The thing is, when a job fails because of resources, for example when the job needs more vCPUs than are available in the region or there is not enough memory, those errors are displayed in the logs, as you can see below.

[screenshot: resource-related error messages shown in the job logs]

But some jobs simply fail with no logs, even though the resources are sufficient.

[screenshot: failed job showing no logs]

Does anyone know how to find the logs in this case?

Change the severity of the logs. If you choose Default, you should see more logs. Given how the job page looks for that failed job, you will probably also need to have a look at the worker logs.
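One way to reach the worker logs outside the console is through Cloud Logging. The sketch below builds the Cloud Logging filter that selects a Dataflow job's worker logs; the job ID is a made-up placeholder, and the `gcloud logging read` invocation at the end assumes an authenticated gcloud CLI with access to the project.

```shell
# Hypothetical job ID; replace with the ID shown on the failed job's page.
JOB_ID="2023-01-01_00_00_00-1234567890123456789"

# Cloud Logging filter for the worker logs of that Dataflow job.
FILTER="resource.type=\"dataflow_step\" AND resource.labels.job_id=\"${JOB_ID}\""
echo "$FILTER"

# With an authenticated gcloud CLI, the worker logs can then be read with:
#   gcloud logging read "$FILTER" --limit=50 --format=json
```

Setting the severity in the filter (e.g. appending `AND severity>=ERROR`) narrows the output the same way the severity dropdown in the console does.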

Depending on the error, the Diagnostics tab may have a summary of what kind of error made the job fail.
