
Does an H2O cloud require a large amount of memory?

I'm trying to set up an H2O cloud on a 4-data-node Hadoop Spark cluster, using R in a Zeppelin notebook. I found that I have to give each executor at least 20 GB of memory before my R paragraph stops complaining about running out of memory (a Java GC out-of-memory error).

Is it expected that I need 20 GB of memory per executor to run an H2O cloud? Or are there configuration entries I can change to reduce the memory requirement?
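For context, the executor memory knobs involved usually look something like the sketch below when the cluster is reached through sparklyr and rsparkling. The specific values ("8g", 4 executors) and the h2o_context() call are assumptions, since the original notebook code isn't shown and newer Sparkling Water releases use a different entry point:

```r
# Minimal sketch of per-executor memory configuration, assuming a
# sparklyr + rsparkling setup (the classic rsparkling API).
library(sparklyr)
library(rsparkling)
library(h2o)

conf <- spark_config()
conf$spark.executor.memory    <- "8g"  # per-executor JVM heap (assumed value)
conf$spark.executor.instances <- 4     # one executor per data node (assumed)

sc <- spark_connect(master = "yarn-client", config = conf)

# Start the H2O cloud inside the Spark executors
hc <- h2o_context(sc)
```

Lowering spark.executor.memory here is the usual way to test how small the H2O nodes can be before the GC errors return.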

There isn't enough information in this post to give specifics, but I will say that the presence of Java GC messages is not necessarily a problem, especially at startup. It's normal to see a flurry of GC messages at the beginning of a Java program's life as the heap expands from nothing to its steady-state working size.

A sign that Java GC really is becoming a major problem is back-to-back full GC cycles with wall-clock times of seconds or more.
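One way to check for this is to turn on GC logging in the executors. A minimal sketch, again assuming a sparklyr-style config object and Java 8 era GC flags (newer JVMs use -Xlog:gc instead):

```r
# Enable GC logging on the executors so full-GC frequency and pause
# times show up in the executor stderr logs. These are standard Java 8
# HotSpot flags; spark.executor.extraJavaOptions is a standard Spark
# property (heap size itself must still be set via spark.executor.memory).
conf$spark.executor.extraJavaOptions <-
  "-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps"
```

In the resulting executor logs, repeated "[Full GC ...]" entries back to back, each taking seconds of wall-clock time, would confirm the heap really is too small; occasional short collections are harmless.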
