Using Scala script in Spark
We are planning to use Scala on Spark to do computations. I just want to know the best way to execute Scala in Spark: Scala as a script, or Scala as an application. Is there any advantage/disadvantage between these two methods?
As mentioned here, it is possible to execute Scala as a script. I am trying to skip the compilation process with sbt so that I can use Scala as a script, just like we would use Python.
I suppose that by "Scala as script" you mean the Scala REPL that comes with Spark (spark-shell), and by "Scala as application" a standalone application packaged by sbt or maven.
You can use the REPL (spark-shell) to test your algorithm/implementation, so it should be used as a staging phase. For production, package the application and run it with spark-submit.
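For contrast, here is a sketch of the "Scala as application" side: a minimal standalone app you would package with `sbt package` and launch with `spark-submit`. The object name, jar path, and logic are illustrative assumptions, not taken from the question.

```scala
// SimpleApp.scala -- illustrative standalone application; package with
// `sbt package`, then run with something like:
//   spark-submit --class SimpleApp target/scala-2.12/simple-app_2.12-0.1.jar
import org.apache.spark.sql.SparkSession

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // In a packaged app you create the SparkSession yourself,
    // unlike spark-shell where `spark` is predefined.
    val spark = SparkSession.builder.appName("SimpleApp").getOrCreate()
    val counts = spark.sparkContext
      .parallelize(Seq("spark scala", "scala script"))
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    counts.collect().foreach(println)
    spark.stop()
  }
}
```

The extra build step buys you a versioned, deployable jar that cluster schedulers can submit repeatedly, which is why it is the usual choice for production.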
Hope this is clear enough.