
Change minimum heap size of executor in Spark on YARN

I want to change the initial/minimum heap size of my executors while running Spark on YARN. Currently it throws the following exception:

java.lang.Exception: spark.executor.extraJavaOptions is not allowed to alter memory settings

I am passing the following option when launching spark-shell: --conf "spark.executor.extraJavaOptions=-Xms4096m"

I am using Spark 1.6.0. Any help is greatly appreciated!

A bit about spark.executor.extraJavaOptions, from the docs:

Note that it is illegal to set Spark properties or heap size settings with this option. Spark properties should be set using a SparkConf object or the spark-defaults.conf file used with the spark-submit script. Heap size settings can be set with spark.executor.memory.
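As a sketch of the spark-defaults.conf approach the docs mention (the 4g value is illustrative, matching the -Xms4096m you were trying to set):

```
# conf/spark-defaults.conf
# Sets the executor heap size for every job submitted from this installation.
spark.executor.memory  4g
```

Settings in spark-defaults.conf apply to all jobs; a per-job --conf flag on the command line overrides them.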

Try this instead: --conf "spark.executor.memory=4g"
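For context, a hedged sketch of the full invocation on YARN (the master and memory values are illustrative, not taken from your setup):

```shell
# Set the executor heap via spark.executor.memory rather than -Xms/-Xmx
# in spark.executor.extraJavaOptions, which Spark rejects for memory flags.
spark-shell --master yarn \
  --conf "spark.executor.memory=4g"
```

Note that spark.executor.memory controls the maximum executor heap (-Xmx); Spark does not expose a separate knob for the initial heap (-Xms), so this is the supported way to size executor memory.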
