
Spark UI Showing Wrong Memory Allocation

We are currently running into an issue where the Spark UI shows that each of our nodes has only 4GB of memory. However, we have allocated 10GB of memory by setting spark-worker.jvmOptions = -Xmx10g. We cannot figure out what is causing this unusual limitation / incorrect memory allocation.

When we run Spark jobs, they run as if there were only 4GB of memory per worker.

Any help would be great! Thanks!

[Screenshot of the Solr UI]

You should set executor memory with the --executor-memory flag of spark-submit. The -Xmx you set on the worker JVM typically only sizes the worker daemon's own heap; it does not control how much memory executors receive.
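
A minimal sketch of such an invocation; the master URL, class name, and JAR are placeholders for your own values:

    spark-submit \
      --master spark://your-master-host:7077 \
      --executor-memory 10g \
      --class com.example.YourApp \
      your-app.jar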

Alternatively, try setting the following parameter inside the conf/spark-defaults.conf file:

spark.executor.memory           10g
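
If the UI still reports 4GB per worker after that, check the memory the worker itself advertises. In standalone mode, the per-worker figure in the UI comes from SPARK_WORKER_MEMORY (which defaults to total RAM minus 1GB), not from the worker JVM's -Xmx. A sketch of the corresponding conf/spark-env.sh entry, assuming a standalone cluster; restart the workers after changing it:

    # conf/spark-env.sh
    # Total memory this worker may offer to executors (standalone mode)
    export SPARK_WORKER_MEMORY=10g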
