
How to increase Java heap space on a Spark Amazon EC2 cluster?

I wrote a program using the Java Spark API. Since my data is large, I am getting the following error:

java.lang.OutOfMemoryError: Java heap space

Any idea how to increase the Java heap space of a Spark EC2 cluster on AWS? I can give details about my code and cluster setup if necessary. Thanks.

I was able to increase the heap space by adding two flags when submitting the application jar with spark-submit:

--executor-memory 10g --driver-memory 2g
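
For context, a minimal sketch of a full spark-submit invocation with these flags in place. The jar name, main class, and master URL below are placeholders for illustration, not taken from the original post:

spark-submit \
  --class com.example.MyApp \
  --master spark://ec2-master-host:7077 \
  --executor-memory 10g \
  --driver-memory 2g \
  my-app.jar

Note that --driver-memory needs to be supplied on the command line (or in spark-defaults.conf) rather than set programmatically in SparkConf, because in client mode the driver JVM has already started by the time the configuration is read.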
