
sbt: packaging subprojects in a multi-project build

I'm using sbt 0.13.12, and this is my project layout:

root
|--- common
|--- sub1
|--- sub2

In the build.sbt file I set sub1 to depend on common. It works fine if I run sbt "project sub1" run. But when I package the subprojects as jar files and run sub1.jar, the error shows that sub1 could not find a class from common.

My goal is to package sub1.jar and sub2.jar with the common code compiled into each jar file.
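For reference, the layout above would typically be wired up in a single build.sbt at the root, along the lines of the sketch below (the scalaVersion and version values are placeholders, not from the question):

```scala
// build.sbt at the project root -- a minimal sketch for the layout above.
// Project names mirror the directory names; settings values are placeholders.

lazy val commonSettings = Seq(
  scalaVersion := "2.11.8",
  version := "0.1.0"
)

lazy val common = (project in file("common"))
  .settings(commonSettings: _*)

// sub1 and sub2 each compile against common's classes
lazy val sub1 = (project in file("sub1"))
  .settings(commonSettings: _*)
  .dependsOn(common)

lazy val sub2 = (project in file("sub2"))
  .settings(commonSettings: _*)
  .dependsOn(common)
```

Note that with this setup a plain sbt package produces a jar containing only each project's own classes; that is why sub1.jar cannot see common's classes at runtime unless common.jar is also on the classpath, which is the problem described above.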

-- UPDATE --
I tried the suggestions in the answer, and got this issue when running sbt assembly:

[error] (utils/*:assembly) deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Inject.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Inject.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Named.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Named.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Provider.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Provider.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Qualifier.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Qualifier.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Scope.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Scope.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Singleton.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Singleton.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\aopalliance\aopalliance\jars\aopalliance-1.0.jar:org/aopalliance/aop/Advice.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\aopalliance-repackaged\jars\aopalliance-repackaged-2.4.0-b34.jar:org/aopalliance/aop/Advice.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\aopalliance\aopalliance\jars\aopalliance-1.0.jar:org/aopalliance/aop/AspectException.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.glassfish.hk2.external\aopalliance-repackaged\jars\aopalliance-repackaged-2.4.0-b34.jar:org/aopalliance/aop/AspectException.class
.... (truncated because it's too long) ....
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.hadoop\hadoop-yarn-common\jars\hadoop-yarn-common-2.2.0.jar:org/apache/hadoop/yarn/factories/package-info.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.hadoop\hadoop-yarn-api\jars\hadoop-yarn-api-2.2.0.jar:org/apache/hadoop/yarn/factories/package-info.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.hadoop\hadoop-yarn-common\jars\hadoop-yarn-common-2.2.0.jar:org/apache/hadoop/yarn/factory/providers/package-info.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.hadoop\hadoop-yarn-api\jars\hadoop-yarn-api-2.2.0.jar:org/apache/hadoop/yarn/factory/providers/package-info.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.hadoop\hadoop-yarn-common\jars\hadoop-yarn-common-2.2.0.jar:org/apache/hadoop/yarn/util/package-info.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.hadoop\hadoop-yarn-api\jars\hadoop-yarn-api-2.2.0.jar:org/apache/hadoop/yarn/util/package-info.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-core_2.11\jars\spark-core_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-launcher_2.11\jars\spark-launcher_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-tags_2.11\jars\spark-tags_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.spark-project.spark\unused\jars\unused-1.0.0.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-network-common_2.11\jars\spark-network-common_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-network-shuffle_2.11\jars\spark-network-shuffle_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-unsafe_2.11\jars\spark-unsafe_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-sql_2.11\jars\spark-sql_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-sketch_2.11\jars\spark-sketch_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] C:\Users\WindowsUser\.ivy2\cache\org.apache.spark\spark-catalyst_2.11\jars\spark-catalyst_2.11-2.0.1.jar:org/apache/spark/unused/UnusedStubClass.class
[error] Total time: 670 s, completed Oct 20, 2016 10:36:31 AM

And yes, I searched for and followed existing solutions, but could not resolve this new issue.

Using the plugin https://github.com/sbt/sbt-assembly you can create fat jars that contain all your dependencies.

In this case, you can type:

sbt
project sub1
assembly
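If the plugin is not already installed, it goes into project/plugins.sbt. The version below is an example from the sbt 0.13 era; check the plugin's README for the version matching your sbt release:

```scala
// project/plugins.sbt
// Version is illustrative; pick the release documented for your sbt version.
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
```

After reloading, sbt "project sub1" assembly builds a single runnable jar under sub1's target directory that bundles the common classes together with all library dependencies, which is exactly what plain package does not do.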

EDIT

Regarding the deduplicate error, you have to define a merge strategy for your dependencies. I have been using the following approach, which has solved 99% of my problems:

lazy val mergeStrategy = Seq(
  assemblyMergeStrategy in assembly := {
    case PathList("javax", "servlet", xs@_*) => MergeStrategy.first
    case PathList("META-INF", "io.netty.versions.properties") => MergeStrategy.last
    case PathList(ps@_*) if ps.last endsWith ".html" => MergeStrategy.first
    case "application.conf" => MergeStrategy.concat
    case "reference.conf" => MergeStrategy.concat
    case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
    case m if m.toLowerCase.matches("meta-inf.*\\.sf$") => MergeStrategy.discard
    case _ => MergeStrategy.first
  }
)
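A variation worth knowing: instead of ending with a blanket MergeStrategy.first, the sbt-assembly README suggests delegating unmatched paths to the plugin's default strategy, which keeps its safer built-in handling (and its error reporting) for anything you did not list explicitly. A sketch:

```scala
// Alternative merge strategy: handle only the known conflicts explicitly
// and fall back to sbt-assembly's default behaviour for everything else.
assemblyMergeStrategy in assembly := {
  case "application.conf" => MergeStrategy.concat
  case "reference.conf"   => MergeStrategy.concat
  case x =>
    // delegate to the default strategy instead of blindly picking "first"
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
```

The trade-off: the blanket MergeStrategy.first silences every conflict but may pick the wrong class; the fallback form keeps failing loudly on conflicts you have not examined yet.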

lazy val root = Project(
  id = "root",
  base = file(".")
) aggregate(common, database, server)

lazy val server = (project in file("server"))
  .settings(cdbServerConfig: _*)
  .settings(mergeStrategy: _*)
  .settings(libraryDependencies ++= (commonDependencies ++ akkaDependencies ++ jsonDependencies))
  .dependsOn(database)

lazy val database = (project in file("database"))
  .settings(commonSettings: _*)
  .settings(libraryDependencies ++= (commonDependencies ++ databaseDependencies))
  .dependsOn(common)

lazy val common = (project in file("common"))
  .settings(commonSettings: _*)
  .settings(libraryDependencies ++= commonDependencies)

So in my case, I want to package the server project, which depends on the database project, which in turn depends on the common project.

Using that merge strategy works fine for me.

But as I said, it works in 99% of cases; when it does not, I have been using the following plugin:

https://github.com/jrudolph/sbt-dependency-graph

With it I can see the dependency tree and find out which libraries are conflicting.
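That plugin is also added through project/plugins.sbt; the version below is an example for sbt 0.13:

```scala
// project/plugins.sbt
// Version is illustrative; see the plugin's README for current releases.
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")
```

Then commands such as dependencyTree, or whatDependsOn with an organization, module, and version, show which of your libraries pull in the conflicting artifacts; in the error above, for example, you could ask what depends on the org.glassfish.hk2.external variant of javax.inject.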
