I can see the parallelism of my job in the Spark UI's details page, but I'm wondering how many executors my job is actually running with.
Where can I see this?
If you follow the same steps described above to open the Environment tab, you'll find entries on that page showing the number of executors used.
Depending on your environment, you may find that `dynamicAllocation` is `true`, in which case you'll have `minExecutors` and `maxExecutors` settings that act as the bounds for your job. With dynamic allocation, the actual number of executors varies within that range over the lifetime of the job, depending on the workload.
If `dynamicAllocation` is `false`, you'll instead see an `executorInstances` entry, with the fixed executor count listed there.
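In the static case, the Environment tab would show something like the following (again, the value 4 is just an example):

```
# Shown on the Environment tab with static allocation
spark.dynamicAllocation.enabled   false
spark.executor.instances          4
```

This corresponds to the `--num-executors` flag of `spark-submit`; the job runs with exactly that many executors for its whole lifetime.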