
Apache Spark Maven POM errors

I am trying out Apache Spark and have run into a problem I am stuck on. I took the JavaSparkHiveExample.java example from the Apache Spark GitHub repository, together with a pom.xml, and set it up as a Maven project in IntelliJ IDEA.

I am able to run other Spark examples (using another, simpler POM), but this one gives me the following errors:

[screenshot of the Maven errors]

  • Scala is installed in IntelliJ IDEA

I am new to Maven, so I would appreciate suggestions for things I could try to solve this problem.

The issue is with the value of ${project.version}. It refers to your own project's version (2.3.0-SNAPSHOT). There is no Maven dependency with this version in the Maven Central repository, hence the errors you are seeing. Instead of using the project version, you can add a new property like this and reference it for all the Spark dependency versions:

<properties>
 <spark.version>1.6.2</spark.version>
</properties>

and then use it in the dependency:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_${scala.binary.version}</artifactId>
  <version>${spark.version}</version>
</dependency>
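Putting the pieces together, a minimal sketch of the relevant POM sections might look like the following. Note the concrete values here are assumptions: ${scala.binary.version} is not defined anywhere in the snippets above, so it is declared explicitly (2.10 is the default Scala binary version for Spark 1.6.x, but you should match it to your Spark build), and the spark-hive dependency is included only because the question uses JavaSparkHiveExample.

```xml
<!-- Sketch only: version values are assumptions; verify them
     against https://mvnrepository.com/ before using. -->
<properties>
  <!-- Assumed Scala binary version for Spark 1.6.x; adjust to your build -->
  <scala.binary.version>2.10</scala.binary.version>
  <spark.version>1.6.2</spark.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <!-- Needed for the Hive example; same version property keeps
       all Spark artifacts in sync -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>
```

Keeping every Spark artifact on the single ${spark.version} property ensures they are always upgraded together, which avoids the mixed-version resolution errors described above.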

Make sure the version you are using is available in the Maven repository: https://mvnrepository.com/

[screenshot of the version listing on mvnrepository.com]
