
dse spark-submit: cores not used when submitting a job (running application shows “Cores = 0” with 3 workers available, 4 cores each)

I'm submitting a job to my DSE Spark cluster, which is composed of 3 workers with 4 cores each.

The job is correctly sent to my cluster, but Cores remains at zero and the job keeps waiting for resources. The 3 workers are idle with 4 cores each; I don't understand why they don't get involved.

See the Spark UI screenshot here.

The job is sent using:

sudo -u cassandra dse spark-submit --master spark://XX.XX.XX.XXX:7077 --executor-cores=4 --total-executor-cores=4 --executor-memory=16g --class com.MyClass /home/spark.jar

and gives me the following message:

Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

Nothing special appears in the Spark logs. Any ideas?

You are submitting the job with --executor-memory=16g, but each worker only has 8g of memory available. A standalone Spark master will not place an executor on a worker that cannot satisfy the requested memory, so no executors launch and the application sits at 0 cores, waiting for resources.

Try running it with --executor-memory=8g and you should be fine.
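For example, re-running the original submit command with the executor memory lowered to fit the workers (a sketch reusing the asker's master URL, class name, and jar path):

sudo -u cassandra dse spark-submit --master spark://XX.XX.XX.XXX:7077 --executor-cores=4 --total-executor-cores=4 --executor-memory=8g --class com.MyClass /home/spark.jar

Alternatively, if the machines actually have more RAM to spare, the per-worker memory budget can be raised instead; in a plain standalone Spark setup that is SPARK_WORKER_MEMORY in conf/spark-env.sh, though DSE manages worker resources through its own configuration, so check the DSE documentation for the equivalent setting.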
