Why does Spark exit with exitCode: 16?
I am using Spark 2.0.0 with Hadoop 2.7 in yarn-cluster mode. Every time, I get the following error:
17/01/04 11:18:04 INFO spark.SparkContext: Successfully stopped SparkContext
17/01/04 11:18:04 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 16, (reason: Shutdown hook called before final status was reported.)
17/01/04 11:18:04 INFO util.ShutdownHookManager: Shutdown hook called
17/01/04 11:18:04 INFO util.ShutdownHookManager: Deleting directory /tmp/hadoop-hduser/nm-local-dir/usercache/harry/appcache/application_1475261544699_0833/spark-42e40ac3-279f-4c3f-ab27-9999d20069b8
17/01/04 11:18:04 INFO spark.SparkContext: SparkContext already stopped.
However, I do get the correct printed output. The same code works fine on Spark 1.4.0 with Hadoop 2.4.0, and there I do not see any exit code.
This is the issue ".sparkStaging not cleaned if application exited incorrectly" (https://issues.apache.org/jira/browse/SPARK-17340), which started after Spark 1.4 (Affects Version/s: 1.5.2, 1.6.1, 2.0.0).

The problem: when running Spark in yarn-cluster mode and the application is killed, .sparkStaging is not cleaned up.

When this issue occurs, Spark 2.0.X raises exitCode 16:
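Until a fixed Spark version is in place, leftover staging directories can be removed by hand. A minimal sketch with assumed paths, using the user name and application id that appear in the log above (the default staging location is under the submitting user's HDFS home; substitute your own values):

```shell
# List leftover .sparkStaging directories (adjust the path if
# spark.yarn.stagingDir is set to a non-default location)
hdfs dfs -ls /user/harry/.sparkStaging

# Remove the stale directory of an application known to be finished
hdfs dfs -rm -r /user/harry/.sparkStaging/application_1475261544699_0833
```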
ERROR ApplicationMaster: RECEIVED SIGNAL TERM
INFO ApplicationMaster: Final app status: FAILED, exitCode: 16, (reason: Shutdown hook called before final status was reported.)
Is it possible that something in your code kills the application? If so, that would not be seen in Spark 1.4, but would be seen in Spark 2.0.0.

Search your code for "exit" (for example, a System.exit call): if such a call exists in your code, the error will not show up in Spark 1.4 but will in Spark 2.0.0.
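The failure mode can be sketched outside Spark with a plain shutdown hook. This is an illustrative analogue in Python (not Spark's actual ShutdownHookManager): when user code exits abruptly, registered hooks run during teardown before any final status has been reported, which is exactly what the "Shutdown hook called before final status was reported" message describes.

```python
import subprocess
import sys
import textwrap

# Child script: registers a shutdown hook (analogous to Spark's
# ShutdownHookManager), then exits abruptly before the final status
# is ever reported.
child = textwrap.dedent("""
    import atexit

    status_reported = False

    def shutdown_hook():
        # Runs during interpreter teardown; status was never reported,
        # mirroring "Shutdown hook called before final status was reported."
        print("hook: status_reported =", status_reported)

    atexit.register(shutdown_hook)
    raise SystemExit(16)   # abrupt exit from "user code"
    status_reported = True  # never reached
""")

result = subprocess.run([sys.executable, "-c", child],
                        capture_output=True, text=True)
print(result.stdout.strip())   # hook: status_reported = False
print(result.returncode)       # 16
```

The hook fires after the exit request but sees the status still unreported, and the abrupt exit code (16 here, chosen to match the log) propagates to the parent, just as the ApplicationMaster's exit code propagates to YARN.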