
Is there a specific order of parameters used in spark-submit while submitting a job?

I am trying to submit a Spark job using spark-submit as follows:

SPARK_MAJOR_VERSION=2 spark-submit \
    --conf spark.ui.port=4090 \
    --driver-class-path /home/devusr/jars/greenplum-spark_2.11-1.3.0.jar \
    --jars /home/devusr/jars/greenplum-spark_2.11-1.3.0.jar \
    --executor-cores 3 \
    --executor-memory 13G \
    --class com.partition.source.YearPartition splinter_2.11-0.1.jar \
    --master=yarn \
    --keytab /home/devusr/devusr.keytab \
    --principal devusr@DEV.COM \
    --files /usr/hdp/current/spark2-client/conf/hive-site.xml,testconnection.properties \
    --name Splinter \
    --conf spark.executor.extraClassPath=/home/devusr/jars/greenplum-spark_2.11-1.3.0.jar \
    --conf spark.executor.instances=10 \
    --conf spark.dynamicAllocation.enabled=false \
    --conf spark.files.maxPartitionBytes=256M

But the job does not run; it only prints:

SPARK_MAJOR_VERSION is set to 2, using Spark2 

Could anyone let me know whether there is any specific order for the parameters used in spark-submit?

As described in https://spark.apache.org/docs/2.1.0/running-on-yarn.html, the format for spark-submit in cluster mode on YARN is:

    $ ./bin/spark-submit --class path.to.your.Class --master yarn --deploy-mode cluster [options] <app jar> [app options]

All spark-submit options must come before the application JAR; anything that follows the JAR is treated as an argument to the application itself.

If splinter_2.11-0.1.jar is the JAR containing the class com.partition.source.YearPartition, you could try something like this:

spark-submit \
        --class com.partition.source.YearPartition                                              \
        --master=yarn                                                                           \
        --conf spark.ui.port=4090                                                               \
        --driver-class-path /home/devusr/jars/greenplum-spark_2.11-1.3.0.jar                    \
        --jars /home/devusr/jars/greenplum-spark_2.11-1.3.0.jar                                 \
        --executor-cores 3                                                                      \
        --executor-memory 13G                                                                   \
        --keytab /home/devusr/devusr.keytab                                                     \
        --principal devusr@DEV.COM                                                              \
        --files /usr/hdp/current/spark2-client/conf/hive-site.xml,testconnection.properties     \
        --name Splinter                                                                         \
        --conf spark.executor.extraClassPath=/home/devusr/jars/greenplum-spark_2.11-1.3.0.jar   \
        --conf spark.executor.instances=10                                                      \
        --conf spark.dynamicAllocation.enabled=false                                            \
        --conf spark.files.maxPartitionBytes=256M                                               \
        splinter_2.11-0.1.jar
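
Note that in the original command, everything placed after splinter_2.11-0.1.jar, including --master=yarn and --keytab, is handed to the application itself rather than parsed by spark-submit. A minimal sketch of a hypothetical entry point (the class name matches the one in the question; the printing logic is purely illustrative) shows what the driver would actually receive in that case:

    // Hypothetical sketch: tokens placed after the application JAR on the
    // spark-submit command line arrive here as plain application arguments,
    // not as spark-submit options.
    object YearPartition {
      def main(args: Array[String]): Unit = {
        // With the question's original ordering, args would contain
        // "--master=yarn", "--keytab", "/home/devusr/devusr.keytab", and so on.
        args.foreach(arg => println(s"application argument: $arg"))
      }
    }

Keeping every option before the JAR, as in the command above, lets spark-submit parse them itself; only genuine application arguments should follow the JAR.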
