
Need some help on setting up Spark for Cassandra on Java

Setting up Spark to access Cassandra from Java throws a NoClassDefFoundError:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/Cloneable
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(Unknown Source)
    at java.security.SecureClassLoader.defineClass(Unknown Source)
    at java.net.URLClassLoader.defineClass(Unknown Source)
    at java.net.URLClassLoader.access$100(Unknown Source)
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at Client.main(Client.java:22)
Caused by: java.lang.ClassNotFoundException: scala.Cloneable
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    ... 13 more

Two jar files are added: spark-cassandra-connector-java-assembly-1.4.0-M1-SNAPSHOT.jar and spark-core_2.10-0.9.0-incubating.jar. spark-cassandra-connector-java-assembly-1.4.0-M1-SNAPSHOT.jar is built against Scala 2.10. Typing scala -version at the command prompt shows Scala code runner version 2.11.6. Accessing Spark from spark-shell works without issue; even reading a Cassandra column family from spark-shell works fine.

import java.util.*;
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.PairFunction;
import com.datastax.spark.connector.*;
import com.datastax.spark.connector.cql.*;
import com.datastax.spark.*;
//import scala.Tuple2;

public class Client {
    public static void main(String[] a)
    {
        // Note: the master URL needs a scheme, e.g. "spark://192.168.1.15:7077" or "local[*]";
        // a bare IP address is not a valid master URL.
        SparkConf conf = new SparkConf().setAppName("MTMPNLTesting").setMaster("spark://192.168.1.15:7077");
    }
}

What might be the reason for this error?

Try also including the Scala library jar on your classpath. The `ClassNotFoundException: scala.Cloneable` means the Scala runtime classes are missing at runtime; spark-core_2.10 needs a 2.10.x scala-library, not the 2.11.6 installation on your machine. If you do not use Maven, download the jar and include it in the project's build properties.
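If you do use Maven, a minimal dependency sketch might look like the following. The exact versions here are assumptions chosen to match the `_2.10` artifacts in the question; the key point is that scala-library must come from the 2.10.x line so it agrees with spark-core_2.10:

```xml
<dependencies>
  <!-- Spark core built against Scala 2.10, as in the question -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>0.9.0-incubating</version>
  </dependency>
  <!-- Scala runtime matching the _2.10 artifacts; Maven normally pulls this in
       transitively, but it must be added explicitly to a hand-built classpath -->
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.4</version>
  </dependency>
</dependencies>
```

Without a build tool, the equivalent fix is simply to put scala-library-2.10.x.jar next to the two jars already on the classpath when running `java -cp ... Client`.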
