
Analysing large Java heap dumps - memory error

I have a very peculiar problem. I have a 30 GB heap dump that I want to analyze on my laptop, which has only 8 GB of RAM. I tried doing that with MAT and IBM Heap Analyzer, but both recommend an -Xmx setting larger than the dump itself. I also tried to parse the dump with MAT's heapDumpParser.bat, but received a memory error.
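For reference, MAT takes its heap limit from the MemoryAnalyzer.ini file next to the executable; a setting along the following lines is apparently what a 30 GB dump would need, which is far beyond my laptop's RAM (the exact value here is illustrative):

-vmargs
-Xmx32g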

Any suggestions on how I can analyze the dump on my laptop successfully?

Thanks in advance!

Memory Analyzer is probably the best tool for analysing out-of-memory issues, but it does require a lot of memory.

If you are unable to find a machine large enough to handle your dump, you could try using the jdmpview command-line tool that ships with the IBM SDK to perform some basic investigation.

It works best with the system core dumps generated on OutOfMemoryError rather than with PHD files, because it does not need to load the dump's contents into memory.
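If you do not already have a system core from the failure, an IBM (J9) JVM can be asked to write one on the next OutOfMemoryError with a -Xdump option along these lines (check the exact syntax against your SDK level):

-Xdump:system:events=systhrow,filter=java/lang/OutOfMemoryError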

You can find it in jre/bin and run it with:

jdmpview -core core_file_name
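On some older IBM SDK levels the core first needs to be post-processed with the jextract tool (also in jre/bin) before jdmpview can read it fully; if that applies to your SDK, the sequence looks roughly like this:

jextract core_file_name
jdmpview -zip core_file_name.zip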

You should probably start by running the command:

info class

as that will generate a basic list of object types, instance counts and sizes.
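A few other commands from the same documentation are worth trying for a first pass (availability can vary by SDK level): info system summarizes the JVM and platform, info thread * prints a stack trace for every thread, and x/j lists the instances of a named class, for example:

info system
info thread *
x/j java/lang/String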

There are full docs here: http://www-01.ibm.com/support/knowledgecenter/SSYKE2_8.0.0/com.ibm.java.win.80.doc/diag/tools/dump_viewer_dtfjview/dump_viewer.html
