
Heap size usage when using "javaw.exe" vs "java.exe"

Below is the program:

public class Dummy {

    public static void main(String[] args) throws Exception {
        // ~268 million references; on a 64-bit JVM this array alone needs on the order of 1-2 GB of heap
        final int LENGTH = Integer.MAX_VALUE / 8;
        Object[] values = new Object[LENGTH];
        int count = 0;
        for (int i = 0; i < Integer.MAX_VALUE; i++) {
            Object o = new Object();
            int hashCode = o.hashCode();
            // skip hash codes that do not fit in the array (>= also excludes the out-of-range index LENGTH)
            if (hashCode >= LENGTH)
                continue;
            if (values[hashCode] != null) {
                System.out.println("found after " + count + ": " + values[hashCode] + " same hashcode as " + o);
                System.out.println(values[hashCode] == o);
                System.exit(0);
            } else {
                System.out.println(hashCode);
                values[hashCode] = o;
                count++;
            }
        }
    }
}

When launched via Eclipse (through the 64-bit javaw.exe), the program's heap usage consistently climbs to roughly the maximum value shown below and the battery drains within minutes,

[screenshot: peak memory usage]

and then the following exception is thrown:

Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded

On the same machine, when the same program is launched from the command line using the 64-bit java.exe, a new hash code consistently clashes with a previous one after creating 22985 objects in the for-loop, with a "private working set" of about 1 GB (max):

23206321
39915397
found after 22985: java.lang.Object@f2eb847 same hashcode as java.lang.Object@f2eb847
false

Without focusing too much on the code logic itself, I would like to understand:

1) Why is there a difference in heap usage between the two approaches, given that no tuning was done for either?

2) How do I control the heap parameters before starting the program, either via Eclipse (javaw.exe) or via the command line (java.exe)? Please help me!

Note: I am working with Java 1.6.

If you don't specify a heap size, the JVM uses Ergonomics (a mechanism that picks default values based on the host architecture) to set various defaults, and the heap size is one of them.

For a 64-bit JVM, Ergonomics picks a larger default heap, which is why the OutOfMemoryError takes longer to appear.

You can verify this by invoking

java -XX:+PrintFlagsFinal -version 2>&1 | grep MaxHeapSize

Since you are on Windows, you can either use the JDK tools or run this small program to check the default memory settings:

long maxBytes = Runtime.getRuntime().maxMemory();
System.out.println("Max memory: " + maxBytes / 1024 / 1024 + "M");

in both environments (Eclipse / javaw.exe and the command line / java.exe).
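For convenience, here is a minimal self-contained version of that check (the class name MaxHeapCheck is just my own choice for this sketch):

public class MaxHeapCheck {

    public static void main(String[] args) {
        // the maximum heap the JVM will use: either the Ergonomics default or an explicit -Xmx value
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max memory: " + maxBytes / 1024 / 1024 + "M");
    }
}

Run it once from Eclipse (javaw.exe) and once from the command line (java.exe); if the two values differ, that alone explains the difference you observed.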

You can also override the heap size by specifying one explicitly; in that case you should see similar memory behavior in both environments, as shown below.
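For example (the 512 MB limit here is only an illustration; pick whatever value suits your test):

java -Xms256m -Xmx512m Dummy

With the same -Xmx value used in Eclipse and on the command line, the program's heap behavior should match.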

Why is there a difference in heap usage between the two approaches, given that no tuning was done for either?

Your test is based on random number generation, i.e. the hashCode. This means it will run for a random amount of time before stopping. If you print the hashCodes, you will see they are 31-bit (none are negative) even on a 64-bit machine, and they are randomly distributed: they have nothing to do with address locations, nor should they, as they cannot change regardless of where the object sits in memory.
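A small sketch (my own illustration, not part of the original test) showing both properties, that identity hash codes are non-negative and that they stay fixed for the lifetime of an object:

public class HashCodeDemo {

    public static void main(String[] args) {
        Object o = new Object();
        int first = o.hashCode();
        // identity hash codes are 31-bit values, so this prints "negative? false"
        System.out.println("hashCode = " + first + ", negative? " + (first < 0));
        // the value stays the same for the object's lifetime, even if the GC moves the object
        System.gc();
        System.out.println("unchanged after GC? " + (o.hashCode() == first));
    }
}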

How do I control the heap parameters before starting the program, either via Eclipse (javaw.exe) or via the command line (java.exe)?

You can control the heap size in Eclipse by changing the VM arguments in the program's run configuration, and on the command line by specifying the amount of memory you want.
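For example (the sizes below are only placeholders): in Eclipse, open the launch configuration for Dummy (Run > Run Configurations... > Arguments tab) and add the flags to the "VM arguments" field; on the command line, pass them directly to java.exe:

VM arguments in Eclipse:  -Xms256m -Xmx1024m
Command line:             java -Xms256m -Xmx1024m Dummy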
