Unresolved dependencies path for SBT project in IntelliJ
I'm using IntelliJ to develop a Spark application, and I'm following this guide on how to make IntelliJ work nicely with an SBT project.
Since my whole team uses IntelliJ, we can just modify build.sbt, but we get this unresolved-dependencies error:
Error:Error while importing SBT project:
[info] Resolving org.apache.thrift#libfb303;0.9.2 ...
[info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
[info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
[info] Resolving org.apache.spark#spark-parent_2.10;2.1.0 ...
[info] Resolving org.scala-lang#jline;2.10.6 ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::          UNRESOLVED DEPENDENCIES         ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn] sparrow-to-orc:sparrow-to-orc_2.10:0.1
[warn] +- mainrunner:mainrunner_2.10:0.1-SNAPSHOT
[trace] Stack trace suppressed: run 'last mainRunner/:ssExtractDependencies' for the full output.
[trace] Stack trace suppressed: run 'last mainRunner/:update' for the full output.
[error] (mainRunner/:ssExtractDependencies) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[error] (mainRunner/:update) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[error] Total time: 47 s, completed Jun 10, 2017 8:39:57 AM
And this is my build.sbt:
name := "sparrow-to-orc"
version := "0.1"
scalaVersion := "2.11.8"
lazy val sparkDependencies = Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0",
  "org.apache.spark" %% "spark-sql" % "2.1.0",
  "org.apache.spark" %% "spark-hive" % "2.1.0",
  "org.apache.spark" %% "spark-streaming" % "2.1.0"
)
libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.7.4"
libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.7.1"
libraryDependencies ++= sparkDependencies.map(_ % "provided")
lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile")
)
assemblyMergeStrategy in assembly := {
  case PathList("org", "aopalliance", xs @ _*) => MergeStrategy.last
  case PathList("javax", "inject", xs @ _*) => MergeStrategy.last
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
  case PathList("javax", "activation", xs @ _*) => MergeStrategy.last
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("com", "google", xs @ _*) => MergeStrategy.last
  case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
  case PathList("com", "codahale", xs @ _*) => MergeStrategy.last
  case PathList("com", "yammer", xs @ _*) => MergeStrategy.last
  case "about.html" => MergeStrategy.rename
  case "META-INF/ECLIPSEF.RSA" => MergeStrategy.last
  case "META-INF/mailcap" => MergeStrategy.last
  case "META-INF/mimetypes.default" => MergeStrategy.last
  case "plugin.properties" => MergeStrategy.last
  case "log4j.properties" => MergeStrategy.last
  case "overview.html" => MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
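(Note that the <<= operator in that last line was deprecated in sbt 0.13.13. A minimal sketch of the equivalent modern form, assuming sbt 0.13.13 or later:)

// Same effect as the <<= line above: run with the full compile classpath,
// so dependencies marked "provided" are available at run time.
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated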
If I don't have this block, then the program works fine:
lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile")
)
But then I won't be able to run the application inside IntelliJ, as the Spark dependencies won't be included in the classpath.
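For context, with this mainRunner pattern the application is launched through the subproject, whose classpath includes Spark at compile scope. For example, from the sbt shell (com.example.Main stands in for a hypothetical main class name):

mainRunner/runMain com.example.Main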
I had the same issue. The solution is to set the Scala version in the mainRunner project to be the same as the one declared at the top of the build.sbt file. Without it, mainRunner falls back to sbt's default Scala version (2.10.x for sbt 0.13), which is why the log shows it trying to resolve sparrow-to-orc_2.10 even though the root project is built for Scala 2.11:
lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile"),
  scalaVersion := "2.11.8"
)
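As a follow-up, one way to keep the two declarations from drifting apart again is to factor the version into a shared val; a minimal sketch (the sharedScalaVersion name is my own):

// Single source of truth for the Scala version, used by both projects.
val sharedScalaVersion = "2.11.8"

scalaVersion := sharedScalaVersion

lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile"),
  scalaVersion := sharedScalaVersion
)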
Good luck!