Conflicting files in uber-jar creation in SBT using sbt-assembly
I am trying to compile and package a fat jar using SBT, and I keep running into the following error. I have tried everything from library-dependency `exclude` rules to merge strategies, with no luck.
[trace] Stack trace suppressed: run last *:assembly for the full output.
[error] (*:assembly) deduplicate: different file contents found in the following:
[error] /Users/me/.ivy2/cache/org.slf4j/slf4j-api/jars/slf4j-api-1.7.10.jar:META-INF/maven/org.slf4j/slf4j-api/pom.properties
[error] /Users/me/.ivy2/cache/com.twitter/parquet-format/jars/parquet-format-2.2.0-rc1.jar:META-INF/maven/org.slf4j/slf4j-api/pom.properties
[error] Total time: 113 s, completed Jul 10, 2015 1:57:21 AM
The current incarnation of my build.sbt file is below:
import AssemblyKeys._
assemblySettings
name := "ldaApp"
version := "0.1"
scalaVersion := "2.10.4"
mainClass := Some("myApp")
libraryDependencies +="org.scalanlp" %% "breeze" % "0.11.2"
libraryDependencies +="org.scalanlp" %% "breeze-natives" % "0.11.2"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.3.1"
libraryDependencies +="org.ini4j" % "ini4j" % "0.5.4"
jarName in assembly := "myApp"
net.virtualvoid.sbt.graph.Plugin.graphSettings
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.10" % "provided"
I realize I am doing something wrong... I just have no idea what.
Here is how you can handle these merge issues.
import sbtassembly.Plugin._

lazy val assemblySettings = sbtassembly.Plugin.assemblySettings ++ Seq(
  publishArtifact in packageScala := false, // remove the Scala library from the uber jar
  mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
    {
      case PathList("META-INF", "CHANGES.txt") => MergeStrategy.first
      // ...
      case PathList(ps @ _*) if ps.last endsWith "pom.properties" => MergeStrategy.first
      case x => old(x)
    }
  }
)
Then add these settings to your project:
lazy val projectToJar = Project(id = "MyApp", base = file(".")).settings(assemblySettings: _*)
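To see why this resolves the `deduplicate` error, here is a minimal sketch in plain Scala (no sbt required) of how such a merge strategy classifies jar entries: the entry path is split into segments and pattern-matched, and any `pom.properties` entry falls through to "keep the first copy" instead of the default deduplication that fails on differing contents. `MergeSketch`, `Strategy`, and `strategyFor` are hypothetical names for illustration, not part of sbt-assembly.

```scala
object MergeSketch {
  // Hypothetical stand-ins for sbt-assembly's MergeStrategy values.
  sealed trait Strategy
  case object First extends Strategy       // keep the first occurrence
  case object Deduplicate extends Strategy // default: fail if contents differ

  // Choose a strategy from the path segments of a jar entry,
  // mirroring the PathList cases in the settings above.
  def strategyFor(path: String): Strategy =
    path.split('/').toList match {
      case List("META-INF", "CHANGES.txt")          => First
      case ps if ps.last.endsWith("pom.properties") => First
      case _                                        => Deduplicate
    }

  def main(args: Array[String]): Unit = {
    // The conflicting entry from the error above is resolved by First:
    println(strategyFor("META-INF/maven/org.slf4j/slf4j-api/pom.properties"))
    // Ordinary class files still use the default:
    println(strategyFor("com/example/Foo.class"))
  }
}
```

The key point is the catch-all `case x => old(x)` in the real settings: everything you do not explicitly match keeps the plugin's default behavior.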
I got your assembly build running by removing Spark from the fat jar (mllib is already included in Spark, so it can be marked `provided`):
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.3.1" % "provided"
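As an aside, the `<<=` and `mergeStrategy in assembly` syntax above is from the old sbt-assembly 0.11.x plugin. On newer sbt-assembly (1.x) releases, the equivalent setting would look roughly like the following sketch (no `import AssemblyKeys._` needed):

```scala
// Sketch assuming sbt-assembly 1.x; adjust to your plugin version.
ThisBuild / assemblyMergeStrategy := {
  case PathList(ps @ _*) if ps.last.endsWith("pom.properties") => MergeStrategy.first
  case x =>
    // Fall back to the plugin's default strategy for everything else.
    val oldStrategy = (ThisBuild / assemblyMergeStrategy).value
    oldStrategy(x)
}
```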
As vitalii said in a comment, this solution was already posted here. I understand that spending hours on a problem without finding the fix can be frustrating, but please be nice.