
Pass file as command line argument to Spark

I am coding a Spark job in Scala and need to send some arguments through the command line in a JSON file format, like the application name, the master, and some more variables.

./bin/spark-submit --name "My app" --master local[4] --conf spark.eventLog.enabled=false --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" myApp.jar

I need to send the app name, the master, and all the other arguments in one JSON file, like:

$SPARK_HOME/bin/spark-submit --properties-file  property.conf
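For reference, --properties-file does not actually take JSON: it expects a plain text file of key/value pairs, one per line, in the same format as conf/spark-defaults.conf. A sketch of what property.conf could look like, mirroring the flags from the first command above:

spark.app.name                   My app
spark.master                     local[4]
spark.eventLog.enabled           false
spark.executor.extraJavaOptions  -XX:+PrintGCDetails -XX:+PrintGCTimeStamps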

Is that possible? How? Can anyone please explain with a simple example?
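Yes, it is possible for anything that maps to a Spark configuration key: spark-submit loads the --properties-file before the application starts, so the Scala code can stay free of hard-coded settings and still pick the values up. A minimal sketch under that assumption (MyApp is a placeholder name, not from the question):

import org.apache.spark.sql.SparkSession

object MyApp {
  def main(args: Array[String]): Unit = {
    // spark-submit --properties-file property.conf has already set
    // spark.app.name, spark.master, etc., so the builder stays empty.
    val spark = SparkSession.builder().getOrCreate()
    println(spark.conf.get("spark.app.name"))
    println(spark.sparkContext.master)
    spark.stop()
  }
}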

You can use the --jars option as follows:

$SPARK_HOME/bin/spark-submit --jars property.conf --class your.Class your.jar

The help page of spark-submit will tell you more:

$SPARK_HOME/bin/spark-submit --help

  --jars JARS Comma-separated list of local jars to include on the driver
              and executor classpaths.

Despite the name, you can also use it to ship configuration files that you want on the driver's and executors' classpath.
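To consume such a file from the job, here is a hedged sketch, assuming property.conf was shipped with --jars as above, really is resolvable as a classpath resource, and holds simple key/value lines (ConfReader is a made-up name):

import java.util.Properties

object ConfReader {
  def main(args: Array[String]): Unit = {
    // property.conf was shipped via --jars, so look it up on the classpath.
    val stream = getClass.getResourceAsStream("/property.conf")
    require(stream != null, "property.conf not found on the classpath")
    val props = new Properties()
    props.load(stream) // accepts both "key value" and "key=value" lines
    stream.close()
    props.stringPropertyNames().forEach { k =>
      println(s"$k = ${props.getProperty(k)}")
    }
  }
}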
