Java performance gradually degrades

I have a Java class that does something like this:

public void slowsDownOverTime() {
    for (int i=0 ; i<nIter ; i++) {
        BigObject bigObject = new BigObject();
        // some code here that populates big object ...
        CustomSerializer.write(bigObject);
    }
}

What I observe is that as the code iterates, the time needed by the serializer to write gets longer and longer. When it starts, the serializer runs in milliseconds; after a few tens of thousands of iterations, it takes several seconds to run.

The disk to which the serializer writes is nowhere near full, and the Java heap space in use is nowhere near its maximum when this happens.

To the extent possible, I've reduced the number and size of objects created and destroyed during this cycle. That basically exhausts my toolkit for addressing this sort of problem!

Any suggestions for how to understand and correct the gradual performance degradation would be greatly appreciated!

I think this is caused by the code that was left out (the "some code here that populates big object" part). Try this:

public void slowsDownOverTime() {
    BigObject bigObject = new BigObject();
    // some code here that populates big object ...
    for (int i=0 ; i<nIter ; i++) {
        CustomSerializer.write(bigObject);
    }
}

This will always write the same object, and I would expect its performance not to degrade.

I think the left-out code builds a growing data structure that is referenced by bigObject. Keep in mind that when serializing, Java traverses all dependent objects to full depth and serializes them as well. So it writes more and more data with each iteration. This can be the cause of the degrading performance and the heavy disk usage.
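A minimal sketch of that situation, using Java's built-in serialization as a stand-in for CustomSerializer (the BigObject shown here is hypothetical, not the asker's class): each new instance references one shared list that grows on every construction, so each write traverses and serializes a larger graph, and the byte counts printed below keep climbing.

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for the question's BigObject: every instance keeps a
// reference to one shared list that grows by one entry per construction.
class BigObject implements Serializable {
    private static final List<int[]> SHARED_HISTORY = new ArrayList<>();
    private final List<int[]> history;

    BigObject() {
        SHARED_HISTORY.add(new int[1024]); // the shared structure keeps growing
        this.history = SHARED_HISTORY;     // this instance now references all of it
    }
}

public class GrowingGraphDemo {
    public static void main(String[] args) throws IOException {
        for (int i = 0; i < 5; i++) {
            BigObject bigObject = new BigObject();
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(buffer)) {
                out.writeObject(bigObject); // traverses the whole reachable graph
            }
            System.out.println("iteration " + i + ": " + buffer.size() + " bytes");
        }
    }
}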

After much fruitless mucking about with jvisualvm's profiler, I resorted to adding a ton of logging. This gave me a clue that the problem was in a Hibernate operation that was performed to populate the BigObject. I was able to fix the performance issue by evicting each object that I retrieved as soon as I retrieved it.

I'm not much of a Hibernate expert, but I think what was happening is that even though the objects were going out of scope in my code (and thus getting garbage collected), Hibernate was keeping a copy of each object in its cache. Then, when it performed the next retrieval, it compared the retrieved object to all the cached ones. As the cache population rose, this operation would take longer and longer.
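A hedged sketch of that fix: the entity and accessor names (SourceRecord, getPayload, BigObject.add) are hypothetical placeholders, since the original code isn't shown; only Session.get, Session.evict, and Session.clear are real Hibernate API. The point is to detach each entity from the Session's first-level cache as soon as its data has been copied, so the persistence context doesn't grow across tens of thousands of retrievals.

import org.hibernate.Session;

class BigObjectPopulator {
    // SourceRecord, getPayload and BigObject.add are hypothetical names standing
    // in for whatever the real model looks like.
    void populate(Session session, BigObject bigObject, long id) {
        SourceRecord record = (SourceRecord) session.get(SourceRecord.class, id);
        bigObject.add(record.getPayload());   // copy what we need out of the entity
        session.evict(record);                // detach it from the first-level cache
        // Calling session.clear() after each batch, or using a StatelessSession,
        // are alternative ways to keep the persistence context from growing.
    }
}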
