
Spark Streaming failed executor tasks

When I look at the Jobs tab in the Spark UI, I can see a task status like 20/20 (4 failed).

Does this mean data was lost for the failed tasks? Aren't those failed tasks retried on a different executor?

[Screenshot: Spark UI Jobs tab showing the task counts]

While you should be wary of failing tasks (they are frequently an indicator of an underlying memory issue), you need not worry about data loss. Spark automatically retries a failed task, up to `spark.task.maxFailures` attempts (4 by default), and the retry may land on a different executor. The stages have been marked as successfully completed, so the tasks that failed were in fact (eventually) successfully processed.
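The retry behaviour described above can be sketched as follows. This is an illustrative sketch of the semantics only, not Spark's actual scheduler code; the function names and the flaky task are hypothetical:

```python
# Illustrative sketch (NOT Spark's scheduler code): Spark retries a failed
# task up to spark.task.maxFailures times (default 4) before aborting the
# stage. If any attempt succeeds, the stage still completes successfully,
# which is why "20/20 (4 failed)" can coexist with a successful stage.

MAX_FAILURES = 4  # mirrors Spark's default spark.task.maxFailures


def run_with_retries(task, max_failures=MAX_FAILURES):
    """Run `task` (a zero-arg callable), retrying on failure.

    Returns the result of the first successful attempt; re-raises the last
    error once all attempts have failed (the point at which a real Spark
    job would abort the stage).
    """
    last_error = None
    for _attempt in range(max_failures):
        try:
            return task()
        except Exception as exc:  # a real failure would be reported to the driver
            last_error = exc
    raise last_error


# A hypothetical flaky task: fails twice, then succeeds. Like the stage in
# the question, the final outcome is still a successful result.
attempts = {"count": 0}

def flaky_task():
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise RuntimeError("executor lost")
    return "processed"

print(run_with_retries(flaky_task))  # -> processed
```

If the same task fails `spark.task.maxFailures` times, the whole job fails instead, so persistent failures (as opposed to occasional retried ones) do need investigation.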
