
Spark on YARN with JDK 8

I am running a Spark job on Hadoop YARN (Hadoop 2.7.0, but I also tried 2.4.0, all on my box using the downloads from the Apache Hadoop website, with Spark 1.3.1). My Spark job is in Scala but contains classes compiled with JDK 8.

When I run Hadoop on JDK 8, I get

INFO yarn.Client: 
 client token: N/A
 diagnostics: Shutdown hook called before final status was reported.
 ApplicationMaster host: kostas-pc
 ApplicationMaster RPC port: 0
 queue: default
 start time: 1431513335001
 final status: SUCCEEDED

Even though the job is marked SUCCEEDED, it actually didn't do anything, due to "Shutdown hook called before final status was reported." In fact, no logging from my Spark job is visible at all.

When I switch the JDK that Hadoop runs on to JDK 7, my job starts running and I can see log entries from my Scala code, but when it reaches the code compiled with JDK 8 it fails with an incompatible class version error (as expected).

So it seems that running Hadoop + Spark with JDK 8 is not compatible. Is there any solution to this?
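One workaround worth trying, sketched below under the assumption that both JDK 7 and JDK 8 are installed on every cluster node (the paths and the main class name are hypothetical): keep the Hadoop daemons on JDK 7, but ask YARN to launch the Spark ApplicationMaster and executors with `JAVA_HOME` pointing at JDK 8, using Spark's documented `spark.yarn.appMasterEnv.*` and `spark.executorEnv.*` settings.

```shell
# Hypothetical JDK 8 install path; adjust to your cluster nodes.
JDK8_HOME=/usr/lib/jvm/java-8-openjdk-amd64

# Hadoop/YARN itself keeps running on JDK 7; only the Spark
# containers (AM + executors) get JAVA_HOME overridden to JDK 8.
# "com.example.MyJob" and the jar name are placeholders.
spark-submit \
  --master yarn-cluster \
  --conf spark.yarn.appMasterEnv.JAVA_HOME=$JDK8_HOME \
  --conf spark.executorEnv.JAVA_HOME=$JDK8_HOME \
  --class com.example.MyJob \
  my-job-assembly.jar
```

Whether this helps depends on whether the failure comes from the container JVM or from Hadoop itself; it only changes the JVM used inside the Spark containers.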

Thanks

It seems Spark 1.4.0 can be used with JDK 8.

