
Failing to run Java Spark on EMR

I have a simple Java Spark program that runs fine locally, but it fails on Amazon EMR. I tried both AMI 3.2.1 and AMI 3.3.1 and got the same error. The failure occurs when constructing the JavaSparkContext below:

public static void main(String[] args) throws Exception {

    if (args.length < 1) {
        System.err.println("Usage: JavaWordCount <file>");
        System.exit(1);
    }

    SparkConf sparkConf = new SparkConf().setAppName("JavaWordCount");
    sparkConf.setMaster("local").set("spark.executor.memory", "1g");
    JavaSparkContext ctx = new JavaSparkContext(sparkConf);  // <-- throws NoSuchMethodError on EMR
        :

The error I got is:

   Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
        at akka.actor.ActorCell$.<init>(ActorCell.scala:305)
        at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
        at akka.actor.RootActorPath.$div(ActorPath.scala:152)
        at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:465)
        at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:124)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
        at scala.util.Try$.apply(Try.scala:191)
        at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
        at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
        at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
        at scala.util.Success.flatMap(Try.scala:230)
        at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
        at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:550)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
        at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:104)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:152)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
        at examples.JavaSparkProfile.main(JavaSparkProfile.java:54)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

Does anyone have any idea?

Thanks a lot!

Check your Scala version, and build against 2.10.*. A NoSuchMethodError on scala.collection.immutable.HashSet$.empty() is a classic symptom of a Scala binary mismatch: Spark on the EMR 3.x AMIs is built against Scala 2.10, so your application (and its transitive dependencies) must be compiled against Scala 2.10 as well, not 2.11.
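
For example, if the project is built with Maven (an assumption, since the build file isn't shown), a minimal sketch of the Spark dependency would pin the Scala 2.10 artifact and mark it as provided so the cluster's own Spark and Scala jars are used at runtime; the version below is a placeholder to be matched to whatever Spark build the AMI ships:

    <dependency>
        <!-- _2.10 suffix: artifact built against Scala 2.10, matching Spark on the EMR 3.x AMIs -->
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <!-- placeholder version: match the Spark version installed on your AMI -->
        <version>1.1.0</version>
        <!-- provided: don't bundle Spark/Scala into your application jar; use the cluster's copies -->
        <scope>provided</scope>
    </dependency>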
