
Build subproject in Spark with sbt

I want to build a subproject in Spark with sbt. I found this example and it works:

$ ./build/sbt -Phive -Phive-thriftserver  (build)
sbt (spark)> project hive (switch to subproject)
sbt (hive)> testOnly *.HiveQuerySuite -- -t foo (run test case)

However, I tried the following, but it does not build; it just quits: ./build/sbt -mllib

I do not know how the author figured out -Phive -Phive-thriftserver. I cannot find this in the Spark source code.

I just want to do exactly the same thing as the example, but with a different subproject.


This is not asking how to use projects to print out all available projects.

Specify the project scope:

./build/sbt mllib/compile

Refer to: http://www.scala-sbt.org/0.13/docs/Scopes.html
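
The same pattern also works from the interactive sbt shell, mirroring the hive example. A minimal sketch, assuming you want to run a single MLlib test suite (KMeansSuite is only an illustrative name; substitute the suite you actually care about):

$ ./build/sbt                        (start the sbt shell)
sbt (spark)> project mllib           (switch to the mllib subproject)
sbt (mllib)> testOnly *.KMeansSuite  (run one test suite)

Note that -Phive and -Phive-thriftserver in the original example are Maven-style build profiles that Spark's sbt build also understands; they enable optional modules rather than selecting a subproject, which is why ./build/sbt -mllib does not work.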
