
Running out of memory while analyzing a Java Heap Dump

I have a curious problem: I need to analyze a Java heap dump (from an IBM JRE) that is 1.5 GB in size. The problem is that every tool I have tried (HeapAnalyzer and IBM Memory Analyzer 0.5) runs out of memory while loading the dump, so I can't actually analyze it. I have 3 GB of RAM on my machine, but that doesn't seem to be enough to analyze the 1.5 GB dump.

My question is: do you know of a specific tool for heap dump analysis (one that supports IBM JRE dumps) that I could run with the amount of memory I have?

Thanks.

Try the SAP Memory Analyzer tool, which is also available as an Eclipse plugin. This tool creates index files on disk as it processes the dump file and therefore requires much less memory than your other options. I'm fairly sure it supports the newer IBM JREs. That said, with a 1.5 GB dump file you may have no option but to run a 64-bit JVM to analyze it: I usually estimate that a heap dump of size n takes 5*n memory to open with standard tools and 3*n memory to open with MAT, though your mileage will vary depending on what the dump actually contains.
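
By that estimate, this 1.5 GB dump needs roughly 5 * 1.5 = 7.5 GB to open with standard tools, or 3 * 1.5 = 4.5 GB with MAT, so either figure is beyond the 3 GB available. Even on a bigger 64-bit machine you usually have to raise MAT's own heap limit, since its default is too small for a dump this size. A minimal sketch, assuming a standard standalone MAT install (the -Xmx value is illustrative; pick one that fits the machine):

    # MemoryAnalyzer.ini, next to the MemoryAnalyzer executable.
    # Everything after -vmargs is passed to the JVM that runs MAT.
    -vmargs
    -Xmx4g

MAT also ships a ParseHeapDump script for building the index files without the UI; its heap is raised the same way, through the VM arguments it passes.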

It's going to be difficult to analyze a 1.5 GB heap dump with 3 GB of RAM, because out of that 3 GB your OS, other processes, services, etc. will easily occupy 0.5 GB, leaving you with only about 2.5 GB. The HeapHero tool is efficient at analyzing heap dumps; it should need only about 0.5 GB more than the size of the heap dump itself. You can give it a try. But the best recommendation is to analyze the heap dump on a machine that has adequate memory, or to get an AWS EC2 instance just for the period of analyzing the dump and terminate the instance when you're done.
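
If you take the EC2 route, here is a rough sketch of the workflow, assuming a Linux instance with enough RAM (the host name, paths, and analyzer jar below are placeholders, not real values):

    # copy the IBM .phd dump to the large-memory instance
    scp heapdump.phd ec2-user@your-instance-host:/data/

    # on the instance, give the analyzer a heap comfortably larger
    # than the dump itself, per the size estimates above
    java -Xmx6g -jar your-analyzer.jar /data/heapdump.phd

    # terminate the instance once the analysis is done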
