
Java VM suddenly exiting without apparent reason

I have a problem with my Java program suddenly exiting, without any exception being thrown or the program finishing normally.

I'm writing a program to solve Project Euler's 14th problem (the Collatz sequence). This is what I have:

private static final long TARGET = 1000000L; // the "1 000 000" referred to below
private static final int INITIAL_CACHE_SIZE = 30000;
private static Map<Long, Integer> cache = new HashMap<Long, Integer>(INITIAL_CACHE_SIZE);

public static void main(String... args) {
    long number = 0;
    int maxSize = 0;

    for (long i = 1; i <= TARGET; i++) {
        int size = size(i);
        if (size > maxSize) {
            maxSize = size;
            number = i;
        }
    }
}
private static int size(long i) {
    if (i == 1L) {
        return 1;
    }
    final int size = size(process(i)) + 1;
    return size;
}

private static long process(long n) {
    return n % 2 == 0 ? n/2 : 3*n + 1;
}

This runs fine, and finishes correctly in about 5 seconds when using a TARGET of 1 000 000.

I wanted to optimize by adding a cache, so I changed the size method to this:

private static int size(long i) {
    if (i == 1L) {
        return 1;
    }
    if (cache.containsKey(i)) {
        return cache.get(i);
    }
    final int size = size(process(i)) + 1;
    cache.put(i, size);
    return size;
}

Now when I run it, the process simply exits when it reaches 555144, the same number every time. No exception, error, or Java VM crash is reported.

Changing the initial cache size doesn't seem to have any effect either, so how could introducing the cache cause this?

If I enforce the cache size as a permanent limit rather than just an initial capacity, like so:

    if (i < CACHE_SIZE) {
        cache.put(i, size);
    }

the bug no longer occurs. Edit: when I set the cache size to around 2M, the bug starts showing again.
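For completeness, here is the whole size() method with that check in place (the CACHE_SIZE value here is just an example):

private static final long CACHE_SIZE = 30000L; // example value, mirroring INITIAL_CACHE_SIZE

private static int size(long i) {
    if (i == 1L) {
        return 1;
    }
    if (cache.containsKey(i)) {
        return cache.get(i);
    }
    final int size = size(process(i)) + 1;
    // Only store keys below the limit, so the map can never grow past CACHE_SIZE entries.
    if (i < CACHE_SIZE) {
        cache.put(i, size);
    }
    return size;
}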

Can anyone reproduce this, and maybe even provide a suggestion as to why it happens?

This is simply an OutOfMemoryError that is not being printed. The program runs fine if I set a higher heap size; otherwise it exits with an unlogged OutOfMemoryError (easy to see in a debugger, though).

You can verify this and get a heap dump (as well as a printout that an OutOfMemoryError occurred) by passing this JVM argument and re-running your program:

-XX:+HeapDumpOnOutOfMemoryError

With this it will then print out something to this effect:

java.lang.OutOfMemoryError: Java heap space
Dumping heap to java_pid4192.hprof ...
Heap dump file created [91901809 bytes in 4.464 secs]

Bump up your heap size with, say, -Xmx200m and you won't have an issue, at least for TARGET = 1000000.
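For example, both options together on the command line (with Main standing in for your own class name):

java -XX:+HeapDumpOnOutOfMemoryError -Xmx200m Main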

It sounds like the JVM itself crashes (that is the first thought when a program dies without a hint of an exception, anyway). The first step with such a problem is to upgrade to the latest JVM revision for your platform. On a genuine crash, the JVM writes an hs_err_pid<pid>.log error file in the directory where you started it, assuming your user has write access to that directory.

That being said, some OutOfMemoryErrors aren't reported on the main thread, so unless you wrap everything in a try/catch (Throwable t) and check whether you caught one, it is hard to be sure you aren't simply running out of memory. The fact that it only uses 100MB could just mean the JVM isn't configured to use more; that can be changed with the startup option -Xmx1024m, which gives it a gigabyte of heap, to see whether the problem goes away.
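To check how much heap the JVM is actually allowed to use, you can print the configured maximum (a small diagnostic snippet, not part of the original program):

long maxBytes = Runtime.getRuntime().maxMemory();
System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB"); // reflects -Xmx, or the platform default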

The code for the try/catch should look something like this:

public static void main(String[] args) {
    try {
        MyObject o = new MyObject();
        o.process();
    } catch (Throwable t) {
        // Throwable also covers Errors such as OutOfMemoryError,
        // which a catch (Exception e) would miss.
        t.printStackTrace();
    }
}

Do everything in the process method and do not store your cache in statics; that way, if the error happens, the object is out of scope by the time the catch block runs and can be garbage collected, freeing enough memory to print the stack trace. No guarantees that this works, but it gives it a better shot.

One significant difference between the two implementations of size(long i) is in the number of objects you are creating.

In the first implementation, no objects are created. In the second, you are doing an awful lot of autoboxing: creating a new Long for each cache access, and putting new Longs and new Integers into the map on each modification.

This explains the increase in memory usage, but not the absence of an OutOfMemoryError. Increasing the heap size does allow it to complete for me.

From this Sun article:

The performance ... is likely to be poor, as it boxes or unboxes on every get or set operation. It is plenty fast enough for occasional use, but it would be folly to use it in a performance critical inner loop.
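One way to sidestep the boxing entirely is to cache into a plain int[] indexed by the number itself. This is only a sketch, assuming you cap the cached range at some fixed bound (the CACHE_SIZE value and the sizes field name are illustrative, not from the original post):

private static final int CACHE_SIZE = 4000000;           // assumed bound; tune to your heap
private static final int[] sizes = new int[CACHE_SIZE];  // 0 means "not yet computed"

private static int size(long i) {
    if (i == 1L) {
        return 1;
    }
    if (i < CACHE_SIZE && sizes[(int) i] != 0) {
        return sizes[(int) i];                            // primitive read, no boxing
    }
    final int size = size(process(i)) + 1;
    if (i < CACHE_SIZE) {
        sizes[(int) i] = size;                            // primitive write, no boxing
    }
    return size;
}

An int[] of four million entries costs about 16 MB, far less than millions of boxed Long/Integer entries in a HashMap.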

If your Java process suddenly exits, some resource may have been maxed out, such as memory. You could try setting a higher maximum heap size.

Do you see a heap dump being generated after the crash? This file should be in the JVM's current working directory; that's where I would look for more information.

I get an OutOfMemoryError on cache.put(i, size);

To see the error, run your program in Eclipse in debug mode; it will appear in the Debug view. It does not produce a stack trace in the console.
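If you would rather not attach a debugger, one alternative (a sketch of a standard technique, not something from the original answer) is to install a default uncaught exception handler before starting the computation, so that whatever Throwable kills a thread gets printed:

Thread.setDefaultUncaughtExceptionHandler(new Thread.UncaughtExceptionHandler() {
    public void uncaughtException(Thread t, Throwable e) {
        // Fires for Errors such as OutOfMemoryError as well as Exceptions.
        System.err.println("Thread " + t.getName() + " died:");
        e.printStackTrace();
    }
});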

The recursive size() method is probably not a good place to do the caching. I moved the cache.put(i, size) call into main()'s for-loop and it runs much more quickly. Otherwise I, too, get an OOM error (no more heap space).

Edit: here's the source; cache retrieval stays in size(), but storing is done in main().

public static void main(String[] args) {
    long num = 0;
    int  maxSize = 0;

    long start = System.currentTimeMillis();
    for (long i = 1; i <= TARGET; i++) {
        int size = size(i);
        if (size >= maxSize) {
            maxSize = size;
            num = i;
        }
        cache.put(i, size);
    }

    long computeTime = System.currentTimeMillis() - start;
    System.out.println(String.format("maxSize: %4d on initial starting number %6d", maxSize, num));
    System.out.println("compute time in milliseconds: " + computeTime);
}

private static int size(long i) {
    if (i == 1L) {
        return 1;
    }

    if (cache.containsKey(i)) {
        return cache.get(i);
    }

    return size(process(i)) + 1;
}

Note that with the cache.put() call removed from size(), the program does not cache every computed size, but it also avoids re-caching previously computed sizes. This does not change the HashMap operations, but as akf points out, it avoids many of the autoboxing/unboxing operations, which is where your heap killer comes from. I also tried an if (!cache.containsKey(i)) { cache.put(...); } guard in size(), but that unfortunately also runs out of memory.
