
Spark Java Scala error

Hey, I want to use Spark in my Java project.

I already added this dependency to my pom file:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>1.4.0</version>
</dependency>

I tried this code:

import org.apache.spark.api.java.JavaSparkContext;

public class sparkSQL {
    public void query() {
        JavaSparkContext sc = new JavaSparkContext();
    }
}

I called this function in my main method, but I got this error:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/Cloneable
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:111)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:56)
    at realtimequeries.sparkSQL.query(sparkSQL.java:7)
    at main.main(main.java:25)

Caused by: java.lang.ClassNotFoundException: scala.Cloneable
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 16 more

I don't understand why I got this error, because JavaSparkContext was created exactly for this kind of use:

A Java-friendly version of SparkContext that returns JavaRDDs and works with Java collections instead of Scala ones.

I already took a look at the pom of the spark-core_2.11 dependency I have, and it seems it does declare a scala dependency:

http://central.maven.org/maven2/org/apache/spark/spark-hive_2.10/1.4.0/spark-hive_2.10-1.4.0.pom

Did I miss something? What am I doing wrong? Thanks in advance.

The class scala.Cloneable is present in scala-library*.jar. This error went away for me after adding scala-library to pom.xml:

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.1</version>
</dependency>
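
If you want to verify that scala-library actually made it onto the runtime classpath (and which jar it comes from), a small diagnostic like the following sketch can help. The class name ScalaClasspathCheck is just illustrative, and the version-printing line assumes the Scala standard library's Properties object is callable from Java via its static forwarders, which is the usual behavior:

    public class ScalaClasspathCheck {
        public static void main(String[] args) throws Exception {
            // Resolves the exact class the stack trace complains about;
            // throws ClassNotFoundException if scala-library is missing.
            Class<?> cloneable = Class.forName("scala.Cloneable");
            // Print the jar the class was loaded from, so version mix-ups
            // (e.g. a stray 2.10 jar) are easy to spot.
            System.out.println(cloneable.getProtectionDomain()
                    .getCodeSource().getLocation());
            // The Scala standard library can also report its own version,
            // e.g. "version 2.11.1".
            System.out.println(scala.util.Properties.versionString());
        }
    }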

Do not mix Scala versions like 2.11 and 2.12 across dependencies; make sure you are using the same Scala version for all libraries.

For example, spark-core_2.11 is built against Scala 2.11, so the following would not work:

    // would not work
    compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.4.4'
    compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.4.4'
    compile group: 'io.delta', name: 'delta-core_2.12', version: '0.4.0'

    // this would work; note the change: 2.11 -> 2.12
    compile group: 'org.apache.spark', name: 'spark-core_2.12', version: '2.4.4'
    compile group: 'org.apache.spark', name: 'spark-sql_2.12', version: '2.4.4'
    compile group: 'io.delta', name: 'delta-core_2.12', version: '0.4.0'

You can use JavaSparkContext to work with Spark from Java, but you still need Scala, since Spark is written in Scala. Most operations are internally transformed to Scala, or work internally with Scala classes. You can program everything in Java, but you will still need Scala on your classpath.

So, in order to fix your error, you need to install Scala and make SCALA_HOME point to the directory where you installed it.
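
For completeness, here is a minimal sketch of the kind of setup that should run once scala-library (matching your Spark artifact's Scala version) is on the classpath. The local[*] master, app name, and class name are illustrative assumptions, not taken from the question:

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SparkSmokeTest {
        public static void main(String[] args) {
            // The no-arg JavaSparkContext constructor expects the master and
            // app name to come from spark-submit; for a standalone run,
            // set them explicitly on a SparkConf.
            SparkConf conf = new SparkConf()
                    .setAppName("SparkSmokeTest")
                    .setMaster("local[*]");
            JavaSparkContext sc = new JavaSparkContext(conf);

            // Java collections in, JavaRDDs out; no Scala code required,
            // but scala-library must still be on the classpath.
            JavaRDD<Integer> numbers = sc.parallelize(Arrays.asList(1, 2, 3, 4));
            long evens = numbers.filter(n -> n % 2 == 0).count();
            System.out.println("even count = " + evens);

            sc.stop();
        }
    }

If this runs, the NoClassDefFoundError is gone and the rest is ordinary Spark configuration.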
