How to pass the command line argument to spark-shell scala script?
The problem is similar to Passing command line arguments to Spark-shell. However, I didn't get the answer I want, so I rephrase my problem below.
I want to run a Scala script in spark-shell with the following command:
spark-shell -i file.scala
It works well without any program arguments. However, I want to add some command line arguments for file.scala. Referencing the way it is done in the Scala shell (http://alvinalexander.com/scala/scala-shell-script-command-line-arguments-args), I tried the following:
spark-shell -i file.scala args1 args2
I then tried to retrieve the arguments, but failed as follows:
var input = args(0)
var output = args(1)
The error message shows that the parameter args is not recognisable.
Does anyone know how to do that?
There is a difference between Scala and spark-shell. spark-shell is a wrapper around Scala and runs in distributed mode, hence parameter passing does not work in the same manner as in plain Scala.
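That said, one widely used workaround (a sketch, not part of the original answer) is to smuggle the values in through an arbitrary Spark configuration property and read them back inside the script. The key spark.driver.args below is not a built-in Spark setting, just a name chosen for this example:

    spark-shell --conf spark.driver.args="args1 args2" -i file.scala

and inside file.scala:

    // Recover the space-separated values passed via --conf.
    // "spark.driver.args" is an arbitrary key chosen for this sketch.
    val cliArgs = sc.getConf.get("spark.driver.args").split("\\s+")
    val input = cliArgs(0)
    val output = cliArgs(1)
    println(s"input=$input, output=$output")

This works because spark-shell accepts any --conf key prefixed with spark., and the predefined sc (SparkContext) exposes the resolved configuration through getConf.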
As the application has to run in distributed mode in Spark, you will not be able to pass parameters to a Scala script through spark-shell. To run it as a standalone Scala application instead, follow the steps sketched below.
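A minimal sketch of such an application (the object name MyApp, the job body, and the paths are illustrative assumptions, not from the original answer). Unlike spark-shell, spark-submit forwards everything after the jar name to the application's main method, so args works as usual:

    // MyApp.scala -- program arguments arrive in main's args parameter.
    import org.apache.spark.sql.SparkSession

    object MyApp {
      def main(args: Array[String]): Unit = {
        val input = args(0)   // first command line argument
        val output = args(1)  // second command line argument

        val spark = SparkSession.builder().appName("MyApp").getOrCreate()
        // Example job: read the input text file and write it back out.
        spark.sparkContext.textFile(input).saveAsTextFile(output)
        spark.stop()
      }
    }

Package it with sbt package and run it with something like:

    spark-submit --class MyApp target/scala-2.12/myapp_2.12-0.1.jar input.txt outdir

where the jar path depends on your build setup.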