
Apache Spark error using Maven

I'm currently trying to learn Apache Spark. I'm using Maven and Java, as I'm not familiar with Scala or SBT.

Also, all the examples I tried seem to have a "pom.xml" file with over 2,000 lines. Are that many dependencies really required?

Here's the example I tried. Here's my current "pom.xml" file. This is the error I'm getting:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
at org.apache.spark.sql.SparkSession$Builder.config(SparkSession.scala:666)
at org.apache.spark.sql.SparkSession$Builder.appName(SparkSession.scala:657)
at misc.apache2.main(apache2.java:47)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)

Process finished with exit code 1

Also, if I'm using a Maven project for Apache Spark, do I need to have a build.sbt file?

Currently running IntelliJ 16.1.3 on Windows 10.

From what I see, you are using different versions of Spark.

According to your pom.xml, you are using version 1.2.0 of Spark Core, version 2.0.0 of Spark SQL, and version 1.6.1 of Spark Hive. Mixing Spark versions also mixes incompatible Scala binaries (Spark 1.2.0 was built against Scala 2.10, while the 2.x artifacts target Scala 2.11), which is exactly the kind of mismatch that surfaces as a NoSuchMethodError on scala.Predef.

Try using the same version of Spark for all Spark dependencies:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0-preview</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.0-preview</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.0.0-preview</version>
</dependency>
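
One way to keep the versions aligned (a sketch of a common Maven convention, not something your current pom uses) is to declare the Spark version once as a property and reference it from every Spark dependency:

<properties>
    <!-- Single source of truth for the Spark version -->
    <spark.version>2.0.0-preview</spark.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>

With this in place, upgrading Spark later is a one-line change, and the three artifacts can never drift apart again. Note that the _2.11 suffix is the Scala binary version; it must also match across all Spark artifacts.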

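Once the dependencies agree, a minimal Java class (a hypothetical SparkTest, just a sketch for verifying the setup) can confirm that SparkSession builds without the NoSuchMethodError:

import org.apache.spark.sql.SparkSession;

public class SparkTest {
    public static void main(String[] args) {
        // Build a local session; "local[*]" runs Spark in-process on all cores.
        SparkSession spark = SparkSession.builder()
                .appName("spark-test")
                .master("local[*]")
                .getOrCreate();

        // If the Spark/Scala versions are consistent, this prints the version
        // instead of throwing NoSuchMethodError inside the builder.
        System.out.println("Spark version: " + spark.version());

        spark.stop();
    }
}

As for your other question: no, a Maven project does not need a build.sbt. That file is SBT's equivalent of pom.xml, and the two build tools are independent.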