
Scala SBT elasticsearch-hadoop unresolved dependency

When adding the dependency libraryDependencies += "org.elasticsearch" % "elasticsearch-hadoop" % "5.1.1" and refreshing the project, I get many unresolved dependencies (cascading, org.pentaho, ...).

However, if I add another dependency, like libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0", it works and I can use the library in my Scala files.
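(As an aside, the spark-core line above hardcodes the Scala version suffix in the artifact name. A minimal sketch of the equivalent using sbt's %% operator, which appends the suffix automatically from the build's scalaVersion; the version numbers here are illustrative:)

// build.sbt fragment: %% resolves the artifact name to spark-core_2.11
// based on scalaVersion, instead of hardcoding the suffix.
scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"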

So, is the problem coming from elasticsearch-hadoop? I'm using SBT 0.13.13, but I also tried with 0.13.8.

I took the dependency from https://mvnrepository.com/artifact/org.elasticsearch/elasticsearch-hadoop/5.1.1. I know that for some dependencies you also need to add the repository (resolvers += ...), but this one doesn't seem to need a repo.

Add the following to your build.sbt file:

resolvers += "conjars.org" at "http://conjars.org/repo"

You can update your .sbt file as follows:

name:="HelloSparkApp"
version:="1.0"
scalaVersion:="2.10.4"
libraryDependencies+="org.apache.spark"%%"spark-core"%"1.5.2"

And execute the commands below from the project directory:

sbt clean
sbt package
sbt eclipse
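Note that sbt clean and sbt package are built-in tasks, while the eclipse task comes from the sbteclipse plugin. If it is not already installed, a sketch of a project/plugins.sbt that enables it (the plugin version is an assumption; pick one matching your sbt release):

// project/plugins.sbt - enables the "eclipse" task used above.
// Plugin version 4.0.0 is illustrative; check the sbteclipse releases.
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")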
