
How to add Spark to Maven project in Eclipse?

I would like to start a Spark project in Eclipse using Maven. I've installed m2eclipse, and I have a working HelloWorld Java application in my Maven project.

I would like to use the Spark framework, and I'm following the directions from the official site. I've added the Spark repository to my pom.xml:

<repository>
      <id>Spark repository</id>
      <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
</repository>

And then the dependency:

<dependency>
      <groupId>spark</groupId>
      <artifactId>spark</artifactId>
      <version>0.9.9.4-SNAPSHOT</version>
</dependency>

But I'm getting an error in Eclipse:

Missing artifact spark:spark:jar:0.9.9.4-SNAPSHOT

How can I resolve this issue? I don't want to download Spark's jar file and place it in the local repository manually.

This is my pom.xml file:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>com.myproject</groupId>
  <artifactId>Spark1</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>jar</packaging>

  <name>Spark1</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <repository>
      <id>Spark repository</id>
      <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
  </repository>

  <dependencies>
<!--     (...) -->

    <dependency>
      <groupId>spark</groupId>
      <artifactId>spark</artifactId>
      <version>0.9.9.4-SNAPSHOT</version>
    </dependency>

  </dependencies>

</project>

Currently, no repository needs to be added to load the Spark library.

You just need to add:

<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.6.0</version>
</dependency>

And that's it.
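For context, once spark-core is on the classpath, a minimal application looks like the sketch below (the class name and route are illustrative, not from the original question; running it requires the spark-core jar, so it won't compile on a bare JDK):

```java
import static spark.Spark.get;

// Minimal Spark (the web framework, com.sparkjava) application.
// With spark-core 2.x on the classpath, running main() starts an
// embedded Jetty server, listening on port 4567 by default.
public class HelloSpark {
    public static void main(String[] args) {
        // Map GET /hello to a handler that returns plain text
        get("/hello", (request, response) -> "Hello Spark");
    }
}
```

Visiting http://localhost:4567/hello should then return "Hello Spark".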

Useful tutorials to play with are here.

The repository block needs to be wrapped in a repositories block:

<repositories>
    <repository>
        <id>Spark repository</id>
        <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
    </repository>
</repositories>

The reason for the failure is that 0.9.9.4-SNAPSHOT is not available. Below is the list of available snapshots; use one of them based on your requirements.

0.9.8-SNAPSHOT/   Sat May 21 21:54:23 UTC 2011
0.9.9-SNAPSHOT/   Mon May 23 10:57:38 UTC 2011
0.9.9.1-SNAPSHOT/ Thu May 26 09:47:03 UTC 2011
0.9.9.3-SNAPSHOT/ Thu Sep 01 07:53:59 UTC 2011

Thanks, Sankara Reddy

I had run into the same issue because I initially started with a different repository URL for Spark, and then changed the repository URL to use an earlier version. Somehow the change didn't seem to take effect until I also changed the repository id, so try changing the repository id.
It could be a bug in Maven, because running Maven from the console also couldn't resolve the dependency without updating the id.
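If a stale repository entry seems to be cached, a generic Maven tip worth trying (not specific to this answer) is forcing Maven to re-check remote repositories for snapshot updates with the -U flag; this can't be verified here without a Maven installation:

```shell
# Force Maven to re-check remote repositories for updated
# snapshots instead of using cached resolution results
mvn -U clean install
```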

Please add the repository tag inside the repositories tag, like below:

<repositories>
    <repository>
        <id>Spark repository</id>
        <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
    </repository>
</repositories>

Recent versions (2.1 and later) of Spark only need the dependency defined inside the pom.xml file:

<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.1</version>
</dependency>

The repository definition is no longer required.

Use this latest artifact: http://mvnrepository.com/artifact/org.apache.spark/spark-core_2.10/1.6.0 (note that org.apache.spark is Apache Spark, the cluster-computing engine, which is a different project from the Spark web framework, com.sparkjava).

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0</version>
</dependency>

Use this, and also make sure you change the Spark library to version 2.11.x in the Eclipse project build path:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.spark-scala</groupId>
    <artifactId>spark-scala</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>${project.artifactId}</name>
    <description>Spark in Scala</description>
    <inceptionYear>2010</inceptionYear>

    <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
        <encoding>UTF-8</encoding>
        <scala.tools.version>2.10</scala.tools.version>
        <!-- Put the Scala version of the cluster -->
        <scala.version>2.10.4</scala.version>
    </properties>

    <!-- repository to add org.apache.spark -->
    <repositories>
        <repository>
            <id>cloudera-repo-releases</id>
            <url>https://repository.cloudera.com/artifactory/repo/</url>
        </repository>
    </repositories>

    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.2.1</version>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.13</version>
                <configuration>
                    <useFile>false</useFile>
                    <disableXmlReport>true</disableXmlReport>
                    <includes>
                        <include>**/*Test.*</include>
                        <include>**/*Suite.*</include>
                    </includes>
                </configuration>
            </plugin>

            <!-- "package" command plugin -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.scala-tools</groupId>
                <artifactId>maven-scala-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>1.2.1</version>
        </dependency>
    </dependencies>
</project>
