
SBT package subproject in subproject

I'm using sbt 0.13.12, and this is my project structure:

root
|--- common
|--- sub1
|--- sub2

In the build.sbt file I set sub1 to depend on common. It works fine if I run it via sbt project sub1 run. But when I package the subprojects as jar files and run sub1.jar, the error shows that sub1 could not find a class from common.
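For reference, a minimal build.sbt sketch for this layout could look like the following (the project names are assumed from the tree above; the real file is not shown in the question):

lazy val common = (project in file("common"))

lazy val sub1 = (project in file("sub1"))
  .dependsOn(common)  // sub1 compiles against classes from common

lazy val sub2 = (project in file("sub2"))
  .dependsOn(common)  // sub2 compiles against classes from common

Note that plain sbt package builds a jar containing only each subproject's own classes; common's classes stay in a separate jar, which is why running sub1.jar on its own fails with a missing class.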

My goal is to package sub1.jar and sub2.jar so that the common code is compiled into each jar file.

-- UPDATE --
I tried the suggestions in the answer below. I got this issue when running sbt assembly:

[error] (utils/*:assembly) deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Inject.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Inject.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Named.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Named.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Provider.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Provider.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Qualifier.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Qualifier.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Scope.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Scope.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Singleton.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Singleton.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\aopalliance\aopalliance\jars\aopalliance-1.0.jar:org/aopalliance/aop/Advice.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\aopalliance-repackaged\jars\aopalliance-repackaged-2.4.0-b34.jar:org/aopalliance/aop/Advice.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\aopalliance\aopalliance\jars\aopalliance-1.0.jar:org/aopalliance/aop/AspectException.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\aopalliance-repackaged\jars\aopalliance-repackaged-2.4.0-b34.jar:org/aopalliance/aop/AspectException.class
.... (truncated because it's too long) ....
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.hadoop\hadoop-yarn-common\jars\hadoop-yarn-common-2.2.0.jar:org/apache/hadoop/yarn/factories/package-info.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.hadoop\hadoop-yarn-api\jars\hadoop-yarn-api-2.2.0.jar:org/apache/hadoop/yarn/factories/package-info.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.hadoop\hadoop-yarn-common\jars\hadoop-yarn-common-2.2.0.jar:org/apache/hadoop/yarn/factory/providers/package-info.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.hadoop\hadoop-yarn-api\jars\hadoop-yarn-api-2.2.0.jar:org/apache/hadoop/yarn/factory/providers/package-info.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.hadoop\hadoop-yarn-common\jars\hadoop-yarn-common-2.2.0.jar:org/apache/hadoop/yarn/util/package-info.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.hadoop\hadoop-yarn-api\jars\hadoop-yarn-api-2.2.0.jar:org/apache/hadoop/yarn/util/package-info.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-core_2.11\jars\spark-core_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-launcher_2.11\jars\spark-launcher_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-tags_2.11\jars\spark-tags_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.spark-project.spark\unused\jars\unused-1.0.0.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-network-common_2.11\jars\spark-network-common_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-network-shuffle_2.11\jars\spark-network-shuffle_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-unsafe_2.11\jars\spark-unsafe_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-sql_2.11\jars\spark-sql_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-sketch_2.11\jars\spark-sketch_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-catalyst_2.11\jars\spark-catalyst_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] Total time: 670 s, completed Oct 20, 2016 10:36:31 AM

And yes, I searched and followed these solutions, but they could not resolve the new issue:

Using the plugin https://github.com/sbt/sbt-assembly you can create fat jars that contain all your dependencies.
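As a sketch, the plugin is enabled by adding one line to project/plugins.sbt (the exact version here is an assumption; pick a 0.14.x release matching your sbt 0.13 installation):

// project/plugins.sbt -- version is an assumption for the sbt 0.13.x era
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")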

In this case, you can type:

sbt
project sub1
assembly

EDIT

Regarding the deduplicate error, you have to use a merge strategy for your dependencies. I have been using the following approach, which has solved 99% of my problems:

lazy val mergeStrategy = Seq(
  assemblyMergeStrategy in assembly := {
    // servlet API classes appear in several jars; keep the first copy found
    case PathList("javax", "servlet", xs @ _*) => MergeStrategy.first
    case PathList("META-INF", "io.netty.versions.properties") => MergeStrategy.last
    case PathList(ps @ _*) if ps.last endsWith ".html" => MergeStrategy.first
    // Typesafe config files can safely be concatenated
    case "application.conf" => MergeStrategy.concat
    case "reference.conf" => MergeStrategy.concat
    // manifests and signature files from dependency jars must not survive merging
    case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
    case m if m.toLowerCase.matches("meta-inf.*\\.sf$") => MergeStrategy.discard
    // for everything else, keep the first occurrence
    case _ => MergeStrategy.first
  }
)

// the root project aggregates the subprojects, so tasks run across all of them
lazy val root = Project(
  id = "root",
  base = file(".")
) aggregate(common, database, server)

lazy val server = (project in file("server"))
  .settings(cdbServerConfig: _*)
  .settings(mergeStrategy: _*)  // the merge strategy is applied to the project being assembled
  .settings(libraryDependencies ++= (commonDependencies ++ akkaDependencies ++ jsonDependencies))
  .dependsOn(database)

lazy val database = (project in file("database"))
  .settings(commonSettings: _*)
  .settings(libraryDependencies ++= (commonDependencies ++ databaseDependencies))
  .dependsOn(common)

lazy val common = (project in file("common"))
  .settings(commonSettings: _*)
  .settings(libraryDependencies ++= commonDependencies)

So in my case, I want to package the server project, which depends on the database project, which in turn depends on the common project.
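Once the assembly task succeeds, the fat jar is written under the subproject's target directory, following sbt-assembly's default <name>-assembly-<version>.jar naming (the exact path and version below are assumptions), and it can then be run standalone:

java -jar server/target/scala-2.11/server-assembly-1.0.jar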

Using that merge strategy works fine for me.
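Applied to the conflicts from the question (javax.inject, aopalliance, the Hadoop package-info classes, and Spark's UnusedStubClass), the catch-all case _ => MergeStrategy.first above should already resolve them by keeping the first copy found. If you prefer to handle them explicitly, extra cases like these could be added before the catch-all (a sketch; which copy is safe to keep depends on which jar you actually need at runtime):

case PathList("javax", "inject", xs @ _*) => MergeStrategy.first
case PathList("org", "aopalliance", xs @ _*) => MergeStrategy.first
case PathList(ps @ _*) if ps.last == "package-info.class" => MergeStrategy.first
case PathList("org", "apache", "spark", "unused", "UnusedStubClass.class") => MergeStrategy.first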

But as I said, it works 99% of the time; when it does not, I have been using the following plugin:

https://github.com/jrudolph/sbt-dependency-graph

That way I can see the dependency tree and find out what is conflicting.
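If it helps, that plugin is also added through project/plugins.sbt (the version here is an assumption from the sbt 0.13 era), after which the dependencyTree task prints the tree for the current project:

// project/plugins.sbt -- version is an assumption for the sbt 0.13.x era
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")

Then run sbt dependencyTree inside the subproject you are assembling to see where the conflicting jars come from.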
