
Can Spark executor be enabled for multithreading more than CPU cores?

I understand that if executor-cores is set to more than 1, the executor will run tasks in parallel. However, from my experience, the number of tasks running in parallel in an executor is always equal to the number of CPU cores assigned to that executor.

For example, suppose I have a machine with 48 cores and set executor-cores to 4; then there will be 12 executors.
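
For reference, here is roughly what that setup looks like (a sketch only; the app name and memory value are illustrative, not my actual job):

```scala
import org.apache.spark.sql.SparkSession

// Roughly the setup described above: 4 cores per executor on 48-core machines,
// which gives up to 12 executors per machine. App name and memory are illustrative.
val spark = SparkSession.builder()
  .appName("lightweight-tasks")
  .config("spark.executor.cores", "4")   // same effect as --executor-cores 4
  .config("spark.executor.memory", "4g")
  .getOrCreate()
```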

What we need is to run 8 or more threads per executor (i.e., 2 or more threads per CPU core). The reason is that the tasks are quite lightweight and CPU usage is quite low (around 10%), so we want to boost CPU utilization by running multiple threads per core.

So I'm asking whether we could possibly achieve this through Spark configuration. Thanks a lot!

Spark executors process tasks, which are derived from the execution plan/code and the partitions of the DataFrame. Each core on an executor only ever processes one task at a time, so each executor runs at most as many concurrent tasks as it has cores. Having more tasks running in one executor than that, as you are asking for, is not possible. You should look for code changes instead: minimize the amount of shuffles (no inner joins; use window functions instead) and check for skew in your data, which leads to non-uniformly sized partitions (DataFrame partitions, not storage partitions).
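
To check for that kind of skew, a minimal sketch could count the rows per DataFrame partition and look for outliers (the app name and input path here are hypothetical; substitute your own data source):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, spark_partition_id}

val spark = SparkSession.builder().appName("skew-check").getOrCreate()
val df = spark.read.parquet("/path/to/input")

// Row count per DataFrame partition: a handful of partitions that are much
// larger than the rest is a sign of skew that leaves most cores idle.
df.groupBy(spark_partition_id().alias("partition_id"))
  .count()
  .orderBy(col("count").desc)
  .show(20)
```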

WARNING: If you are, however, alone on your cluster and you do not want to change your code, you can change the YARN settings for the node and advertise it with more than 48 cores, even though there are just 48. This can lead to severe instability of the system, since executors will then share CPUs. (And your OS also needs CPU power.)
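
For completeness, the setting in question is the NodeManager's advertised vcores in yarn-site.xml; a sketch with an illustrative value (twice the physical cores) would look like this:

```xml
<!-- Sketch only: advertise 96 vcores on a 48-core node so YARN can schedule
     two executor cores per physical core. As warned above, oversubscribing
     CPUs like this can make the node unstable. -->
<property>
  <name>yarn.nodemanager.resource.cpu-vcores</name>
  <value>96</value>
</property>
```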

This answer is meant as a complement to @Telijas' answer, because in general I agree with it. It's just to give that tiny bit of extra information.

There are some configuration parameters with which you can set the number of threads for certain parts of Spark. There is, for example, a section in the Spark docs that discusses some of them (for all of this I'm looking at the latest Spark version at the time of writing this post: version 3.3.1):

Depending on jobs and cluster configurations, we can set number of threads in several places in Spark to utilize available resources efficiently to get better performance. Prior to Spark 3.0, these thread configurations apply to all roles of Spark, such as driver, executor, worker and master. From Spark 3.0, we can configure threads in finer granularity starting from driver and executor. Take RPC module as example in below table. For other modules, like shuffle, just replace “rpc” with “shuffle” in the property names except spark.{driver|executor}.rpc.netty.dispatcher.numThreads, which is only for RPC module.

Property Name | Default | Meaning
spark.{driver|executor}.rpc.io.serverThreads | Fall back on spark.rpc.io.serverThreads | Number of threads used in the server thread pool
spark.{driver|executor}.rpc.io.clientThreads | Fall back on spark.rpc.io.clientThreads | Number of threads used in the client thread pool
spark.{driver|executor}.rpc.netty.dispatcher.numThreads | Fall back on spark.rpc.netty.dispatcher.numThreads | Number of threads used in RPC message dispatcher thread pool
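
As an illustration, these thread settings are ordinary Spark properties, so they can be passed like any other configuration. A sketch (the values are made up, and note that these pools tune Spark's internal RPC/shuffle machinery, not the number of tasks an executor runs):

```scala
import org.apache.spark.sql.SparkSession

// Illustrative values only: these tune internal thread pools (RPC dispatcher,
// RPC server I/O), not the number of concurrent tasks per executor.
val spark = SparkSession.builder()
  .appName("thread-config-example")
  .config("spark.executor.rpc.netty.dispatcher.numThreads", "8")
  .config("spark.executor.rpc.io.serverThreads", "8")
  .getOrCreate()
```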

Then here follows a list (non-exhaustive and in no particular order; I just looked through the source code) of some other number-of-thread-related configuration parameters; a small sketch of how to read them back at runtime follows the list:

  • spark.sql.streaming.fileSource.cleaner.numThreads
  • spark.storage.decommission.shuffleBlocks.maxThreads
  • spark.shuffle.mapOutput.dispatcher.numThreads
  • spark.shuffle.push.numPushThreads
  • spark.shuffle.push.merge.finalizeThreads
  • spark.rpc.connect.threads
  • spark.rpc.io.threads
  • spark.rpc.netty.dispatcher.numThreads (will be overridden by the driver/executor-specific ones from the table above)
  • spark.resultGetter.threads
  • spark.files.io.threads
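
As a small illustration, one way to see whether any of these are explicitly set in an application is to read them back from the SparkConf (a sketch; the app name is hypothetical, and getOrCreate reuses an already-running session if there is one):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("thread-conf-inspect").getOrCreate()
val conf = spark.sparkContext.getConf

// Print which of a few of the parameters above are explicitly set;
// unset ones simply fall back to their internal defaults.
Seq(
  "spark.rpc.io.threads",
  "spark.shuffle.push.numPushThreads",
  "spark.files.io.threads"
).foreach { key =>
  println(s"$key -> ${conf.getOption(key).getOrElse("<not set>")}")
}
```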

I didn't add the meaning of these parameters to this answer because that's a different question and quite "Googleable". This is just meant as an extra bit of info.
