
Spark Streaming failed executor tasks

When I look at the Jobs tab in the Spark UI, I can see a task status like 20/20 (4 failed).

Does this mean there is data loss for the failed tasks? Aren't failed tasks moved to a different executor?


While you should be wary of failing tasks (they are frequently an indicator of an underlying memory issue), you need not worry about data loss. Spark automatically retries a failed task, often rescheduling it on a different executor, up to `spark.task.maxFailures` attempts (4 by default). The stages have been marked as successfully completed, so the tasks that failed were in fact (eventually) successfully processed.
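If an application runs on flaky hardware and you want to tolerate more retries before a stage is aborted, the limit can be raised at submit time. A minimal sketch, assuming a `spark-submit` launch; the application file name is a placeholder:

```shell
# Sketch: raise the per-task retry limit from the default of 4.
# A task is only reported as failed to the job after it has
# failed this many attempts; until then Spark reschedules it,
# typically on a different executor.
spark-submit \
  --conf spark.task.maxFailures=8 \
  my_streaming_app.py   # placeholder application name
```

Note that repeatedly hitting the retry limit usually points to a real problem (memory pressure, bad input, a failing node) that is better fixed directly than papered over with a higher limit.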
