
How to set Spark executor memory?

I have set spark.executor.memory to 2048m, and on the UI's "Environment" page I can see this value has been set correctly. But on the "Executors" page, there is only one executor, and its memory is 265.4MB. That is a very strange value. Why not 256MB, or exactly what I set?

What am I missing here?

The "Executors" tab in the UI also includes the driver in the list. Its executor ID is shown as <driver> . The driver process is not launched the way executors are, so it is not affected by spark.executor.memory .

  • If you start the driver with spark-submit , its maximum heap size can be controlled by spark.driver.memory or --driver-memory
  • If you start it as a plain old Java program, use the usual -Xmx Java flag.
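As an illustration of the first bullet, a spark-submit invocation that sizes the driver and the executors separately might look like this (the class name, jar name, and master URL are placeholders, not from the original question):

```shell
# Driver heap is set by --driver-memory (equivalent to spark.driver.memory);
# executor heap is set by --executor-memory (equivalent to spark.executor.memory).
spark-submit \
  --class com.example.MyApp \
  --master "local[4]" \
  --driver-memory 2g \
  --executor-memory 2048m \
  myapp.jar
```

The same values can also be passed as --conf spark.driver.memory=2g and --conf spark.executor.memory=2048m.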

Please see the following question for an explanation of the 265.4MB memory size:

How to set Apache Spark Executor memory
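For intuition, here is a back-of-envelope sketch of where a figure like 265.4MB can come from under the legacy Spark storage-memory model. The numbers are assumptions for illustration: a default 512MB driver heap, a JVM-reported Runtime.maxMemory of roughly 491.7MB (less than -Xmx because one survivor space is excluded), and the legacy defaults spark.storage.memoryFraction = 0.6 with a 0.9 safety fraction:

```python
# Hypothetical reconstruction of the UI's storage-memory figure (legacy model).
jvm_max_memory_mb = 491.7   # assumed Runtime.getRuntime().maxMemory() for a 512MB heap
storage_fraction = 0.6      # legacy spark.storage.memoryFraction default
safety_fraction = 0.9       # legacy safety fraction applied on top

ui_storage_mb = jvm_max_memory_mb * storage_fraction * safety_fraction
print(f"{ui_storage_mb:.1f}MB")  # close to the 265.4MB shown in the UI
```

The point is that the UI reports only the fraction of the heap reserved for storage, not the full -Xmx value, which is why it matches neither 256MB nor the configured setting.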

