
Spark on Yarn with Jdk8

I am running a Spark job on Hadoop YARN (Hadoop 2.7.0, but I also tried 2.4.0, all on my box using the downloads from the Apache Hadoop web site, with Spark 1.3.1). My Spark job is in Scala but contains classes compiled with JDK 8.

When I run Hadoop on JDK 8, I get:

INFO yarn.Client: 
 client token: N/A
 diagnostics: Shutdown hook called before final status was reported.
 ApplicationMaster host: kostas-pc
 ApplicationMaster RPC port: 0
 queue: default
 start time: 1431513335001
 final status: SUCCEEDED

Even though the job is marked as SUCCEEDED, it actually didn't do anything, due to "Shutdown hook called before final status was reported." In fact, no logging is visible from my Spark job.

When I switch the JDK that I run Hadoop with to JDK 7, my job starts running and I can see log entries from my Scala code, but when it reaches the code compiled with JDK 8 it fails with an incompatible class error (as expected).
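The incompatible class error can be confirmed from the class-file version embedded in each compiled class: the major version is stored (big-endian) at byte offset 6 of a `.class` file, JDK 8 emits major version 52 by default, and a JDK 7 runtime refuses anything above 51. A minimal sketch for checking this (the helper name and the class path in the usage line are illustrative, not part of the original question):

```shell
# Read the class-file major version (bytes 6-7, big-endian) of a .class file.
# 51 = Java 7, 52 = Java 8; a JDK 7 runtime rejects anything above 51.
major_version() {
  # od -tu1 prints each byte as an unsigned decimal; combine the two bytes.
  set -- $(od -An -j6 -N2 -tu1 "$1")
  echo $(( $1 * 256 + $2 ))
}

# Usage (hypothetical path): major_version target/classes/MyJob.class
```

`javap -verbose MyJob.class | grep major` reports the same number, if a JDK is on the path.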

So it seems that running Hadoop + Spark with JDK 8 is not compatible. Are there any solutions to this?

Thanks

It seems that Spark 1.4.0 works with JDK 8.
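If upgrading Spark is not an option, another common workaround is to make YARN launch the application master and executors under a JDK 8 install, using Spark's documented environment-override properties for YARN (`spark.yarn.appMasterEnv.*` and `spark.executorEnv.*`). A sketch, where the JDK path and main-class name are placeholders for your own setup:

```shell
# Sketch: point the YARN application master and executors at a JDK 8 install,
# even if the Hadoop daemons themselves run on JDK 7.
# JDK8_HOME and com.example.MyJob are hypothetical; adjust for your box.
JDK8_HOME=/usr/lib/jvm/java-8-openjdk

spark-submit \
  --master yarn-cluster \
  --conf spark.yarn.appMasterEnv.JAVA_HOME="$JDK8_HOME" \
  --conf spark.executorEnv.JAVA_HOME="$JDK8_HOME" \
  --class com.example.MyJob \
  my-job.jar
```

With this, the driver, application master, and executors all run on JDK 8, so the JDK 8-compiled classes load without the incompatible-class error.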
