
yarn - spark parallel job

I set up a YARN cluster that has only one worker node, and it seems to work fine when I submit a single Spark application. But when I submit more than one job, the jobs sit in the Hadoop queue and are processed one by one. I want to process my applications in parallel, not one by one. Is there any configuration for this, or is it impossible on YARN?

YARN runs submitted jobs one by one by default. To run multiple jobs concurrently, you can change the executor resources each job requests (memory, number of executors, and executor cores) so that a single application does not claim the whole cluster:

spark-submit --master yarn --deploy-mode cluster --class <main-class> --executor-memory 2g --num-executors 15 --executor-cores 3 <application-jar>

You can also change the cluster-level resource and scheduler settings in your yarn-site.xml, as sketched below.
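
As a minimal sketch only: the answer above does not say which yarn-site.xml properties to change, so the settings below are an assumption using stock Hadoop property names. Switching the ResourceManager to the Fair Scheduler lets several applications share the node's resources instead of running strictly one after another:

<!-- Assumption: use the Fair Scheduler so concurrent applications
     share resources instead of queueing FIFO -->
<property>
  <name>yarn.resourcemanager.scheduler.class</name>
  <value>org.apache.hadoop.yarn.server.resourcemanager.scheduler.fair.FairScheduler</value>
</property>
<!-- Assumption: example limits for what the single node advertises;
     two jobs whose combined requests fit under these caps can run
     at the same time -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>8192</value>
</property>
<property>
  <name>yarn.nodemanager.resource.cpu-vcores</name>
  <value>8</value>
</property>

After editing the file, restart the ResourceManager and NodeManager. If each spark-submit asks for less than half of the advertised memory and vcores, two applications should be able to run side by side.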
