
How can I analyse a large heap dump of around 35-40 GB?

I have to analyse a Java heap dump of size 35-40 GB, which cannot be loaded on my local machine; it can only be opened on a remote server with a large amount of memory.

I found Tool for analyzing large Java heap dumps to be the best link so far. But after configuring everything and running all the command lines correctly, I was not able to get any report file.

My ParseHeapDump.sh file looks like this:

#!/bin/sh
#
# This script parses a heap dump.
#
# Usage: ParseHeapDump.sh <path/to/dump.hprof> [report]*
#
# The leak report has the id org.eclipse.mat.api:suspects
# The top component report has the id org.eclipse.mat.api:top_components
#
./MemoryAnalyzer -consolelog -application org.eclipse.mat.api.parse "$@" -vmargs -Xms8g -Xmx10g -XX:-UseGCOverheadLimit
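As the script's header comment notes, report ids are passed after the dump path. For reference, a typical headless invocation of the stock MAT script looks like the following (the dump path is illustrative; the report id is the one named in the comment above):

```sh
# Parse the dump and generate the leak-suspects report without a GUI.
# MAT typically writes the result next to the dump,
# e.g. heap_Leak_Suspects.zip for heap.hprof.
./ParseHeapDump.sh /data/dumps/heap.hprof org.eclipse.mat.api:suspects
```

If no report id is given, MAT only builds its index files and produces no report archive, which can look like "no report file" even though parsing succeeded.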

and my MemoryAnalyzer.ini file looks like this:

-startup
plugins/org.eclipse.equinox.launcher_1.5.0.v20180512-1130.jar
--launcher.library
plugins/org.eclipse.equinox.launcher.gtk.linux.x86_64_1.1.700.v20180518-1200
java -Xmx8g -Xms10g -jar plugins/org.eclipse.equinox.launcher_1.5.0.v20180512-1130.jar -consoleLog -consolelog -application org.eclipse.mat.api.parse "$@"
-vmargs
-Xms8g
-Xmx10g
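For comparison, here is a minimal sketch of what a MemoryAnalyzer.ini is usually expected to contain: only Eclipse launcher options, then a single -vmargs line followed by JVM flags, one per line, with -Xms no larger than -Xmx. A bare `java ... -jar ...` command line does not belong in the .ini file. The jar version strings below are copied from the question, not verified:

```ini
-startup
plugins/org.eclipse.equinox.launcher_1.5.0.v20180512-1130.jar
--launcher.library
plugins/org.eclipse.equinox.launcher.gtk.linux.x86_64_1.1.700.v20180518-1200
-vmargs
-Xms8g
-Xmx10g
-XX:-UseGCOverheadLimit
```

Note also that in the original snippet above, `java -Xmx8g -Xms10g` sets the initial heap (10g) larger than the maximum heap (8g), which a JVM would reject at startup.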

Please tell me if I'm making any mistake in the configuration, or suggest any other tool available on the market.

Processing a large heap dump is a challenge. Both VisualVM and Eclipse Memory Analyzer require too much memory to process heap dumps on the order of a few dozen GiB.

Commercial profilers show better results (YourKit in particular), though I am not sure of their practical limits.

To routinely process 100+ GiB, I came up with a headless solution, heaplib, which is based on the code base of VisualVM (NetBeans, actually).

Heaplib is neither graphical nor interactive; it is oriented toward automated reporting. The tool lets you write heap-analysis code in OQL/JavaScript (or Java if you wish), though its capabilities are limited to keep memory requirements down. Processing 100 GiB can take an hour, but for a non-interactive workflow that is acceptable.
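To give a flavour of such analysis code: in the NetBeans/VisualVM dialect of OQL (the one heaplib builds on), a simple scripted query over a dump might look like the following. The size threshold is illustrative, and the `value` field assumes a dump from a JVM whose `java.lang.String` is backed by an array field of that name:

```
// Find strings whose backing array exceeds 10,000 elements --
// a common first pass when hunting oversized strings in a dump.
select s from java.lang.String s where s.value.length > 10000
```

In a headless setup, a query like this runs against the dump file as part of a batch job and its results are written to a report, rather than being browsed interactively.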

