
spark sbt compile error libraryDependencies

My Spark version is 1.2.0-bin-hadoop2.4 and my Scala version is 2.11.7. I am getting an error, so I can't use sbt.

~/sparksample$ sbt

Starting sbt: invoke with -help for other options
[info] Set current project to Spark Sample (in build file:/home/beyhan/sparksample/)

> sbt compile

[info] Updating {file:/home/beyhan/sparksample/}default-f390c8...
[info] Resolving org.scala-lang#scala-library;2.11.7 ...
[info] Resolving org.apache.spark#spark-core_2.11.7;1.2.0 ...
[warn] module not found: org.apache.spark#spark-core_2.11.7;1.2.0
[warn] ==== local: tried
[warn]  /home/beyhan/.ivy2/local/org.apache.spark/spark-core_2.11.7/1.2.0/ivys/ivy.xml
[warn] ==== public: tried
[warn]  http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.11.7/1.2.0/spark-core_2.11.7-1.2.0.pom
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::          UNRESOLVED DEPENDENCIES         ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core_2.11.7;1.2.0: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/beyhan/sparksample/}default-f390c8/*:update: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11.7;1.2.0: not found
[error] Total time: 2 s, completed Oct 15, 2015 11:30:47 AM

Any suggestions? Thanks

There exists no spark-core_2.11.7 jar file. You have to get rid of the maintenance version number .7 in the Spark dependency, because spark-core_2.11 is what exists. All Scala 2.11.x versions should be binary compatible.

Update

A minimal sbt build file could look like:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"

As @Till Rohrmann suggested, there is no such thing as spark-core_2.11.7, and your build.sbt appears to reference that library.

I suggest you edit the file /home/beyhan/sparksample/build.sbt and remove the reference to that library.

The correct reference is:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.0"

Remember that it is not only spark-core that has no version 2.11.7; the same applies to any other Spark libraries you might be using.
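For example, a sketch of how several Spark modules would be declared consistently (spark-streaming stands in here for whatever other modules your build actually uses):

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "1.2.0",
  "org.apache.spark" % "spark-streaming_2.11" % "1.2.0"  // same _2.11 suffix, never _2.11.7
)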


