I have a cluster containing 1 master and 5 worker nodes, each with 32 cores and 64 GB of memory.
Is there a pattern for calculating the following parameters for a Spark submission on YARN?
--executor-memory --num-executors --executor-cores
Given the hardware above, how should these values be calculated?
Answer:

There is a common rule of thumb. On each node, leave 1 core and 1 GB of memory for the OS and Hadoop daemons, and keep --executor-cores at 5 or fewer (more cores per executor tends to hurt HDFS I/O throughput).

Calculations:
Cores per executor: 5, so each node fits (32 - 1) / 5 ≈ 6 executors.
Total executors: 6 per node × 5 nodes = 30; reserve one for the YARN ApplicationMaster, so --num-executors 29.
Memory per executor: divide the usable 63 GB per node among its 6 executors (≈ 10.5 GB), then subtract roughly 7% for YARN's off-heap overhead (spark.executor.memoryOverhead), leaving about 9-10 GB for --executor-memory.

--executor-memory controls each executor's JVM heap size; see the Spark Memory Management documentation for details.
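A minimal sketch of this sizing heuristic, assuming we reserve 1 core and 1 GB per node for the OS/daemons, cap executors at 5 cores for HDFS throughput, reserve one executor slot for the ApplicationMaster, and deduct roughly 7% of executor memory for YARN overhead (the exact reservation and overhead fraction are tunable, not fixed rules):

```python
# Rule-of-thumb sizing for spark-submit on YARN.
nodes = 5
cores_per_node = 32
mem_per_node_gb = 64

usable_cores = cores_per_node - 1        # leave 1 core for OS/daemons
usable_mem_gb = mem_per_node_gb - 1      # leave 1 GB for OS/daemons

executor_cores = 5                       # HDFS-throughput heuristic: <= 5
executors_per_node = usable_cores // executor_cores   # 31 // 5 = 6
num_executors = executors_per_node * nodes - 1        # minus 1 for the AM

mem_per_executor_gb = usable_mem_gb / executors_per_node  # 63 / 6 = 10.5
executor_memory_gb = int(mem_per_executor_gb * 0.93)      # ~7% YARN overhead

print(f"--executor-cores {executor_cores} "
      f"--num-executors {num_executors} "
      f"--executor-memory {executor_memory_gb}g")
```

For this cluster the sketch yields --executor-cores 5, --num-executors 29, and --executor-memory 9g (often rounded up to 10g in practice, at the cost of a tighter overhead margin).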