
Spark-Scala build.sbt libraryDependencies UnresolvedDependency

I'm trying to import a dependency in my build.sbt file from here: https://github.com/dmarcous/spark-betweenness .

When I hover over the error it says:

Expression type ModuleID must conform to Def.SettingsDefinition in SBT file
Unresolved Dependency

I am new to Scala, so my question may be silly. Thanks in advance.

It is still unclear what your build configuration looks like, but the following build.sbt works (in the sense that it compiles and does not show the error that you mentioned):

name := "test-sbt"

organization := "whatever"

version := "1.0.0"

scalaVersion := "2.10.7"

libraryDependencies += "com.centrality" %% "spark-betweenness" % "1.0.0"

Alternatively, if you have a multi-project build, it could look like this:

lazy val root = project
  .settings(
    name := "test-sbt",
    organization := "whatever",
    version := "1.0.0",
    scalaVersion := "2.10.7",
    libraryDependencies += "com.centrality" %% "spark-betweenness" % "1.0.0"
  )
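
As an aside, the "must conform to Def.SettingsDefinition" message usually appears when a ModuleID expression stands alone in build.sbt instead of being assigned to a setting. A minimal sketch of the likely mistake (a hypothetical reconstruction, since the original build.sbt was not posted):

// Wrong: a bare ModuleID is not a setting, so sbt rejects the expression
"com.centrality" %% "spark-betweenness" % "1.0.0"

// Right: append the ModuleID to the libraryDependencies setting
libraryDependencies += "com.centrality" %% "spark-betweenness" % "1.0.0"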

However, you're probably going to find that it still does not work, because sbt cannot resolve this dependency. Indeed, this library does not seem to be available in either Maven Central or JCenter. It is also very old: it appears to have been published only for Scala 2.10 and a very old Spark version (1.5), so most likely you won't be able to use it with recent Spark environments (2.x and Scala 2.11).
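
If you still need the library despite that, one possible workaround (a sketch, not something verified against this project) is to build it from source and publish it to your local Ivy repository, which sbt resolves by default:

// Hypothetical workaround, assuming the project's own build declares
// organization "com.centrality" and version "1.0.0" as its README suggests.
// From a shell:
//   git clone https://github.com/dmarcous/spark-betweenness.git
//   cd spark-betweenness && sbt publishLocal
// After publishLocal succeeds, the same dependency line should resolve:
libraryDependencies += "com.centrality" %% "spark-betweenness" % "1.0.0"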
