
How to check the number of cores Spark uses?

I have spark.cores.max set to 24 (3 worker nodes), but if I log into a worker node I see only one process [command = java] running that consumes memory and CPU. I suspect it is not using all 8 cores (on m2.4xlarge).

How can I check the number of cores Spark is actually using?

You can see the number of cores occupied on each worker in the cluster in the Spark Web UI.
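As a sanity check on the numbers in the question, here is a back-of-the-envelope sketch of how `spark.cores.max` should spread over the workers. It assumes Spark's standalone default behavior of spreading executors evenly across workers (`spark.deploy.spreadOut = true`); the figures (24 max cores, 3 workers, 8 cores per m2.4xlarge) come from the question itself:

```python
# Figures from the question: spark.cores.max = 24, 3 worker nodes,
# each worker an m2.4xlarge with 8 cores.
cores_max = 24
num_workers = 3
cores_per_worker_available = 8

# With spread-out scheduling, cores are allocated evenly across workers,
# capped by what each worker actually has.
cores_per_worker_used = min(cores_max // num_workers, cores_per_worker_available)
print(cores_per_worker_used)  # → 8
```

So with these settings every core on every worker should be claimed. Note that a single executor JVM uses all of its allocated cores via threads, so seeing only one java process per worker does not mean only one core is in use; check the "Cores" column for each worker in the Web UI instead.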

