
How do I properly submit a Spark job?

I have some Scala/Spark code packaged into a sparktest_2.10-1.0.jar file, and I'm trying to run spark-submit:

spark-submit --class sparktest_2.10-1.0 --master local[2]

I get: Error: Must specify a primary resource (JAR or Python or R file)

What is the proper way to run spark-submit?

spark-submit \
  --class <main-class> \
  --master <master-url> \
  --deploy-mode <deploy-mode> \
  --conf <key>=<value> \
  ... # other options
  <application-jar> \
  [application-arguments]

E.g.:

spark-submit --class "com.example.myapp" myapp.jar
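In your case, the error occurs because --class was given the JAR's file name and no primary resource (the JAR itself) was passed at all. The JAR goes at the end of the command as the primary resource, and --class takes the fully-qualified name of the object that defines main. Assuming that object is sparktest.Main (the real name depends on your code), the command would look like:

spark-submit --class sparktest.Main --master local[2] sparktest_2.10-1.0.jar

For reference, here is a minimal sketch of what such a main object could look like; the package, object name, and job body are assumptions, not taken from your project:

package sparktest

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical entry point: the value passed to --class must be the
// fully-qualified name of this object, e.g. sparktest.Main
object Main {
  def main(args: Array[String]): Unit = {
    // The master is supplied by spark-submit (--master local[2]),
    // so it is not hard-coded here
    val conf = new SparkConf().setAppName("sparktest")
    val sc = new SparkContext(conf)

    // Trivial job just to confirm the submission works end to end
    println(sc.parallelize(1 to 100).sum())

    sc.stop()
  }
}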

