I have a Spark program written in Scala.
I run `sbt assembly` to build a fat jar.
I copy the jar onto a server of my HDFS cluster (so the jar is physically on the machine).
I run `jar -tvf project.jar` to check that my main class is in the jar, and it is: `com/mycompany/MyMainClass.class`.
Then I run the following command on a server of my cluster:
`./hadoop/spark/bin/spark-submit --class com.mycompany.MyMainClass --master yarn project.jar`
and I get this error:
Failed to load com.mycompany.MyMainClass.
java.lang.ClassNotFoundException: com.mycompany.MyMainClass
Can someone help me?
It was caused by the sbt-assembly merge strategy for my Scala project. This is the merge strategy that worked for me:

```scala
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _ => MergeStrategy.first
}
```
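For context, here is a minimal `build.sbt` sketch showing where that merge strategy fits, assuming the sbt-assembly plugin is enabled in `project/plugins.sbt`; the project name, Scala version, and Spark version below are illustrative placeholders, not values from the original post:

```scala
// build.sbt -- names and versions are illustrative placeholders
name := "project"
version := "0.1"
scalaVersion := "2.12.18"

// Spark is supplied by the cluster at runtime, so mark it "provided"
// to keep its jars out of the assembled fat jar
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.2" % "provided"

// Without a merge strategy, duplicate files under META-INF (manifests,
// signature files) cause deduplicate errors during `sbt assembly`, and
// stale signature files can break class loading from the resulting jar
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _ => MergeStrategy.first
}
```

After adding this, rebuild with `sbt assembly` and resubmit the resulting jar with the same `spark-submit` command.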