
Negative Active Tasks in Spark UI under load (Max number of executor failures reached)

I am running a Spark Streaming application on Spark 1.5.0 in CDH 5.5.0. In the logs I see "Max number of executor failures", but I am unable to find the root cause.

We are getting this issue intermittently, roughly every other day. Final app status: FAILED, exitCode: 11, (reason: Max number of executor failures reached)

It's a bug; you can track the changes in the following tickets:

Edit: about the message "Max number of executor failures reached": Spark on YARN has a parameter, spark.yarn.max.executor.failures. By default it is twice the number of executors, with a minimum of 3. If more executors fail than this parameter allows, the application is killed.

You can change the value of this parameter (see the sketch below). However, I would be worried about why you have so many executor failures in the first place: maybe you have too little memory, or a bug in the code? Without code and/or context information we cannot help investigate a potential bug.
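If you do decide to raise the threshold, here is a minimal sketch of what that might look like in a Scala Spark Streaming app. The app name, executor count, threshold value, and batch interval are illustrative assumptions, not taken from the question:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Sketch only: all values below are illustrative assumptions.
// On YARN, spark.yarn.max.executor.failures defaults to
// max(2 * number of executors, 3); once more executors than this
// have failed, the application exits with exitCode 11
// ("Max number of executor failures reached").
val conf = new SparkConf()
  .setAppName("StreamingApp")                     // hypothetical app name
  .set("spark.executor.instances", "4")           // default threshold would then be 8
  .set("spark.yarn.max.executor.failures", "16")  // raise the kill threshold

val ssc = new StreamingContext(conf, Seconds(10)) // 10s batches, illustrative

// ... define the streaming job here, then:
// ssc.start()
// ssc.awaitTermination()
```

The same property can equivalently be passed at submission time with spark-submit's --conf flag instead of being hard-coded in the application.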
