
Spark standalone cluster

I have a Spark standalone cluster. The cluster consists of 2 worker nodes and 1 master node. When I run a program on the master node, jobs are only assigned to one worker; the other worker does not do anything.

[screenshot of the Spark master web UI showing the workers]

The workers appear in the picture above. To run my code, I used the following command:

spark-submit --class Main.Main --master spark://172.19.0.2:7077 --deploy-mode cluster Main.jar ReadText.txt  

From the above image we can see that each of your worker nodes has only 1 core.

You can use the command below:

spark-submit --class Main.Main --total-executor-cores 2 --executor-cores 1 --master spark://172.19.0.2:7077 --deploy-mode cluster Main.jar ReadText.txt

Hope this helps!
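If you prefer, the same limits can also be set from inside the application instead of on the spark-submit line. This is only a minimal sketch, assuming a Scala application; the properties spark.cores.max and spark.executor.cores are the standalone-mode equivalents of the two flags above:

import org.apache.spark.{SparkConf, SparkContext}

object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("Main")
      // Same effect as --total-executor-cores 2 on the spark-submit line
      .set("spark.cores.max", "2")
      // One core per executor, so one executor can run on each 1-core worker
      .set("spark.executor.cores", "1")

    val sc = new SparkContext(conf)
    // ... your job ...
    sc.stop()
  }
}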

Can you please try once with deploy mode client, or simply omit that parameter? What is happening here is that when the deploy mode is cluster, one of your workers runs the driver and the other worker runs the RDD tasks, which is why only one worker appears to be executing your job. When you run the shell, it uses client mode by default, and both workers are used for running tasks. Try the command below to deploy the application, and please also share a code snippet of your application.

spark-submit --class Main.Main --master spark://172.19.0.2:7077  Main.jar ReadText.txt   
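For reference, here is a minimal sketch of what such an application could look like, assuming a simple Scala word count over the file passed as the first argument (ReadText.txt); your actual logic may differ. Requesting at least 2 partitions from textFile gives the scheduler a chance to place tasks on both workers:

package Main

import org.apache.spark.{SparkConf, SparkContext}

object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Main")
    val sc = new SparkContext(conf)

    // args(0) is the input file given on the spark-submit line (ReadText.txt)
    // Request at least 2 partitions so tasks can land on both workers
    val lines = sc.textFile(args(0), minPartitions = 2)

    val counts = lines
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    sc.stop()
  }
}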

