
Error on integrating Spark with Scala project in IntelliJ IDE

I created a simple SBT project in the IntelliJ IDE, with the following library dependencies in build.sbt:

import _root_.sbt.Keys._

name := "untitled"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.5.1",
  "org.apache.spark" %% "spark-sql" % "1.5.1" ,
  "org.apache.spark" %% "spark-mllib"  % "1.5.1")

The goal is to import Spark and MLlib, and then create a Scala object as explained here.
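For reference, a minimal sketch of the kind of Scala object that step would produce; the object name, local master setting, and the sample vectors are illustrative assumptions, not code from the original question:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg.Vectors

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // Run locally just to confirm that spark-core and spark-mllib resolve on the classpath.
    val conf = new SparkConf().setAppName("SimpleApp").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Build a tiny RDD of MLlib vectors and count it.
    val vectors = sc.parallelize(Seq(Vectors.dense(1.0, 2.0), Vectors.dense(3.0, 4.0)))
    println(vectors.count())

    sc.stop()
  }
}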

However, the following error occurs when importing the project:

 SBT project import
 [warn] Multiple dependencies with the same organization/name but different versions. To avoid conflict, pick one version:
 [warn]  * org.scala-lang:scala-compiler:(2.11.0, 2.11.7)
 [warn]  * org.apache.commons:commons-lang3:(3.3.2, 3.0)
 [warn]  * jline:jline:(0.9.94, 2.12.1)
 [warn]  * org.scala-lang.modules:scala-parser-combinators_2.11:(1.0.1, 1.0.4)
 [warn]  * org.scala-lang.modules:scala-xml_2.11:(1.0.1, 1.0.4)
 [warn]  * org.slf4j:slf4j-api:(1.7.10, 1.7.2)
 [warn] [FAILED] net.sourceforge.f2j#arpack_combined_all;0.1!arpack_combined_all.jar(src): (0ms)
 [warn] ==== local: tried
 [warn]   C:\Users\Cezar\.ivy2\local\net.sourceforge.f2j\arpack_combined_all\0.1\srcs\arpack_combined_all-sources.jar
 [warn] ==== public: tried
 [warn]   https://repo1.maven.org/maven2/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1-sources.jar
 [warn] [FAILED] javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(doc): (0ms)
 [warn] ==== local: tried
 [warn]   C:\Users\Cezar\.ivy2\local\javax.xml.bind\jsr173_api\1.0\docs\jsr173_api-javadoc.jar
 [warn] ==== public: tried
 [warn]   https://repo1.maven.org/maven2/javax/xml/bind/jsr173_api/1.0/jsr173_api-1.0-javadoc.jar
 [warn] [FAILED] javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(src): (0ms)
 [warn] ==== local: tried
 [warn]   C:\Users\Cezar\.ivy2\local\javax.xml.bind\jsr173_api\1.0\srcs\jsr173_api-sources.jar
 [warn] ==== public: tried
 [warn]   https://repo1.maven.org/maven2/javax/xml/bind/jsr173_api/1.0/jsr173_api-1.0-sources.jar
 [warn] ::::::::::::::::::::::::::::::::::::::::::::::
 [warn] ::              FAILED DOWNLOADS            ::
 [warn] :: ^ see resolution messages for details  ^ ::
 [warn] ::::::::::::::::::::::::::::::::::::::::::::::
 [warn] :: net.sourceforge.f2j#arpack_combined_all;0.1!arpack_combined_all.jar(src)
 [warn] :: javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(doc)
 [warn] :: javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(src)
 [warn] ::::::::::::::::::::::::::::::::::::::::::::::

Spark does not work with Scala 2.11. It uses Scala 2.10, so you need to use a compatible Scala version (see http://spark.apache.org/docs/latest/).
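Following that advice, a build.sbt along these lines should resolve without the version conflicts above; the exact 2.10.x patch version is an assumption, any recent 2.10 release should work:

name := "untitled"

version := "1.0"

// 2.10.x matches the prebuilt Spark 1.5.1 artifacts recommended in the answer
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.5.1",
  "org.apache.spark" %% "spark-sql"   % "1.5.1",
  "org.apache.spark" %% "spark-mllib" % "1.5.1")

With scalaVersion set to 2.10.x, the %% operator picks up the _2.10 Spark artifacts automatically.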

Alternatively, as @eliasah mentioned in the comments, you can build Spark yourself. Instructions for building Spark are at http://spark.apache.org/docs/latest/building-spark.html.

