
How to set Spark executor memory?

I have set spark.executor.memory to 2048m, and on the UI "Environment" page I can see the value has been set correctly. But on the "Executors" page, I see only one executor, and its memory is 265.4MB. That is a very strange value: why not 256MB, or exactly what I set?

What am I missing here?

The "Executors" tab on the UI also includes the driver in the list. Its "Executor ID" is listed as <driver>. This process is not started by Spark itself, so it is not affected by spark.executor.memory.

  • If you start the driver with spark-submit, its maximal memory can be controlled by spark.driver.memory or --driver-memory.
  • If you start it as a plain old Java program, use the usual -Xmx Java flag.
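The two cases above can be sketched as follows (the main class and JAR name are placeholders, not from the original question):

```shell
# Case 1: driver launched via spark-submit; its heap is set with --driver-memory
# (equivalent to --conf spark.driver.memory=2g).
# com.example.MyApp and my-app.jar are placeholder names.
spark-submit \
  --class com.example.MyApp \
  --driver-memory 2g \
  --conf spark.executor.memory=2048m \
  my-app.jar

# Case 2: driver launched as a plain old Java program; use -Xmx directly
java -Xmx2g -cp my-app.jar com.example.MyApp
```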

Please see the following question for where the 265.4MB memory size comes from:

How to set Apache Spark Executor memory
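A hedged sketch of the usual explanation from that question, for older Spark versions with legacy memory management: with a 512m heap, Runtime.getRuntime.maxMemory() reports roughly 491MB (one survivor space is excluded), and the "Storage Memory" shown on the Executors page is that value scaled by spark.storage.memoryFraction and spark.storage.safetyFraction. The 491.5 figure and the fraction defaults below are assumptions drawn from that explanation, not measured here:

```python
# Assumed: Runtime.getRuntime.maxMemory() for a 512m heap, slightly less
# than 512 because one survivor space is excluded from the reported max.
max_heap_mb = 491.5
memory_fraction = 0.6   # spark.storage.memoryFraction (legacy default)
safety_fraction = 0.9   # spark.storage.safetyFraction (legacy default)

storage_memory_mb = max_heap_mb * memory_fraction * safety_fraction
print(f"{storage_memory_mb:.1f}MB")  # → 265.4MB
```

This reproduces the odd-looking 265.4MB: it is not the configured executor memory, but the storage-memory slice of a default-sized heap.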
