
Apache Spark number of executors

I have a Spark app on Databricks, running on a cluster of 32 nodes with 16 cores and 30 GB of memory each. I wanted to change some session configurations, but no matter what I change, I cannot get more than 32 executors (as seen on the Executors page of the Spark UI). These are the configs I've changed:

spark.executor.instances
spark.executor.memory 
spark.executor.cores

As I read, an executor should run at most 5 concurrent tasks, so I wanted to make 4 executors per node, each using 4 cores, for a total of 32 × 4 = 128 executors. How can I do that?
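
For what it's worth, the layout I'm after would look roughly like this as a spark-submit call on a YARN cluster (a sketch only: the 6g figure is a rough guess at 30 GB split four ways with headroom for overhead, your_app.py is a placeholder, and --num-executors is a YARN flag, so it may not apply to a Databricks managed cluster):

# Sketch for a YARN deployment (not Databricks-specific):
# 32 nodes x 4 executors per node = 128 executors, 4 cores each.
# ~6g per executor is a rough estimate (30 GB per node / 4, minus overhead).
# your_app.py is a hypothetical placeholder application.
spark-submit \
  --num-executors 128 \
  --executor-cores 4 \
  --executor-memory 6g \
  your_app.py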

Kind regards, Stefan

With a recent Spark build, you can set the --executor-cores and --total-executor-cores parameters; the total number of executors will be total-executor-cores / executor-cores.

Try this one:

spark-submit --executor-memory 4g --executor-cores 4 --total-executor-cores 512
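
With these values the arithmetic works out to 512 / 4 = 128 executors, matching the layout asked for above. One caveat: --total-executor-cores only applies to Spark standalone and Mesos deployments; on YARN the executor count is set with --num-executors instead, and on Databricks executor sizing is normally fixed in the cluster configuration rather than passed to spark-submit.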
