Spark YARN cluster mode gets this error: "Could not find or load main class org.apache.spark.deploy.yarn.ApplicationMaster"
I have this job running fine in YARN client mode; however, in cluster mode I get the following error.
Log contents:
Error: Could not find or load main class org.apache.spark.deploy.yarn.ApplicationMaster
End of LogType:stderr
I have not set spark.yarn.jars or spark.yarn.archive. However, in the trace I do see the spark-yarn jar being uploaded. Is there any additional setting needed here?
16/11/01 10:49:49 INFO yarn.Client: Uploading resource file:/etc/security/keytabs/spark.keytab -> hdfs://beixvz579:8020/user/sifsuser/.sparkStaging/application_1477668405073_0026/spark.keytab
16/11/01 10:49:50 INFO yarn.Client: Uploading resource file:/home/sifsuser/spark200/jars/spark-yarn_2.11-2.0.0.jar -> hdfs://beixvz579:8020/user/sifsuser/.sparkStaging/application_1477668405073_0026/spark-yarn_2.11-2.0.0.jar
16/11/01 10:49:50 INFO yarn.Client: Uploading resource file:/home/sifsuser/lib/sparkprogs.jar -> hdfs://beixvz579:8020/user/sifsuser/.sparkStaging/application_1477668405073_0026/sparkprogs.jar
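For reference, a cluster-mode submission along these lines triggers the error above (a minimal sketch; the main class name is a placeholder, while sparkprogs.jar is the application jar shown in the trace):
spark-submit --master yarn --deploy-mode cluster --class com.example.MyApp /home/sifsuser/lib/sparkprogs.jar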
The jar is spark-yarn_2.11-2.4.0.jar (my version), and its location is $SPARK_HOME/jars/.
First step (add this to spark-defaults.conf):
spark.yarn.jars hdfs://hadoop-node1:9000/spark/jars/*
Second step:
hadoop fs -put $SPARK_HOME/jars/* hdfs://hadoop-node1:9000/spark/jars/
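With spark.yarn.jars pointing at HDFS, the client no longer has to upload the whole $SPARK_HOME/jars directory on every submission, and the ApplicationMaster classpath is built from that HDFS location. A quick sanity check might look like this (the application class and jar are placeholders, not from the original post):
hadoop fs -ls hdfs://hadoop-node1:9000/spark/jars/ | grep spark-yarn
spark-submit --master yarn --deploy-mode cluster --class com.example.MyApp /path/to/app.jar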
After a lot of debugging, I found that this error was thrown because of a missing class that the ApplicationMaster depends on. In my case it was one of the logging jars that the AM class depends on. After adding the additional jars, I can now submit jobs.
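As a sketch of how to apply this, the container stderr that names the missing class can be pulled with yarn logs (the application ID below is the one from the trace), and the extra dependency jars can then be shipped at submit time with --jars. The logging jar names and main class here are illustrative; substitute whatever your own trace shows as missing:
yarn logs -applicationId application_1477668405073_0026
spark-submit --master yarn --deploy-mode cluster --jars /home/sifsuser/lib/log4j-api.jar,/home/sifsuser/lib/log4j-core.jar --class com.example.MyApp /home/sifsuser/lib/sparkprogs.jar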