
How to pass command line arguments to a spark-shell Scala script?

The problem is similar to Passing command line arguments to Spark-shell. However, I didn't get the answer I wanted, so I am rephrasing my problem below.

I want to run a Scala script in spark-shell with the following command:

spark-shell -i file.scala

It works well without any program arguments. However, I want to add some command line arguments for file.scala. Following the way it is done in the Scala shell (http://alvinalexander.com/scala/scala-shell-script-command-line-arguments-args), I tried the following:

 spark-shell -i file.scala args1 args2

and then tried to retrieve the arguments inside the script, but it failed:

 var input = args(0)
 var output = args(1)

The error message says that args is not recognized.

Does anyone know how to do this?

There is a difference between the Scala shell and spark-shell. spark-shell is a wrapper around the Scala REPL and runs in distributed mode, so parameter passing does not work the same way it does in plain Scala.

To run a Scala application with arguments, here are the steps you need to follow:

  • Develop a Scala application
  • Build a jar file
  • Run the spark-submit command, passing it the jar file and class name along with the parameters (see the sketch below)
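
Putting the first two steps together, here is a minimal sketch of such an application. The object name MyApp and the meaning of the two arguments (an input path and an output path) are assumptions for illustration, not part of the original question:

    // Minimal self-contained Spark application (sketch).
    // MyApp and the argument meanings are hypothetical placeholders.
    import org.apache.spark.sql.SparkSession

    object MyApp {
      def main(args: Array[String]): Unit = {
        // In a compiled application, command line arguments arrive
        // through main's args parameter, unlike in spark-shell.
        val input = args(0)
        val output = args(1)

        val spark = SparkSession.builder().appName("MyApp").getOrCreate()

        // Example workload: copy the input text file to the output path.
        spark.read.textFile(input).write.text(output)

        spark.stop()
      }
    }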

Because the application has to run in distributed mode in Spark, you will not be able to pass parameters to Scala scripts using spark-shell.
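
For the third step, assuming the jar was built with sbt package (the jar path below is a placeholder), the invocation could look like this; args1 and args2 then arrive inside main as args(0) and args(1):

    spark-submit --class MyApp --master local[*] \
      target/scala-2.12/myapp_2.12-0.1.jar args1 args2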
