
Tomcat memory consumption is more than heap + permgen space

I am observing a mismatch in Tomcat RAM consumption between what the OS says and what jVisualVM says.

From htop, the Tomcat JVM has 993 MB of resident memory.

From jVisualVM, the Tomcat JVM is using:

  • Heap Max: 1,070,399,488 B
  • Heap Size: 298,438,656 B
  • Heap Used: variable, between 170 MB and 270 MB
  • PermGen Max: 268,435,456 B
  • PermGen Size: 248,872,960 B
  • PermGen Used: slightly variable, around 150 MB

From my understanding, the OS memory consumption should be roughly Heap Size + PermGen Size ≈ 522 MB, but that is 471 MB less than the 993 MB I'm observing.

Anyone got an idea what I'm missing here?

PS: I know that my max heap is much higher than what is used, but I'm assuming that should have no effect as long as the JVM does not actually use it (i.e. as long as Heap Size stays lower).

Thanks! Marc

From my understanding, the OS memory consumption should be roughly Heap Size + PermGen Size ≈ 522 MB, but that is 471 MB less than the 993 MB I'm observing. Anyone got an idea what I'm missing here?

If I understand the question, what you are seeing is a combination of memory fragmentation and JVM memory overhead in other areas. We often see twice the memory usage for our production programs compared to what we would expect from our memory settings.

Memory fragmentation can mean that although the JVM thinks the OS has given it some number of bytes, a certain additional number of bytes had to be handed over because of memory subsystem optimizations.

In terms of JVM overhead, there are a number of other storage areas that are not included in the standard memory configs. Here's a good discussion about this. To quote:

The following are examples of things that are not part of the garbage collected heap and yet are part of the memory required by the process:

  • Code to implement the JVM
  • The C manual heap for data structures implementing the JVM
  • Stacks for all of the threads in the system (app + JVM)
  • Cached Java bytecode (for libraries and the application)
  • JITed machine code (for libraries and the application)
  • Static variables of all loaded classes

The first thing we have to bear in mind is that JVM process memory (what the OS sees) = Java object heap + [permanent space + code generation + socket buffers + thread stacks + direct memory space + JNI code + JNI-allocated memory + garbage collection], where in this "collection" the perm space is usually the biggest chunk.
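
To see how much of this breakdown the JVM itself can report, here is a minimal sketch using the standard java.lang.management API (the MemoryBreakdown class name is just illustrative). Note that native areas such as thread stacks, JNI allocations and the JVM's own C heap never show up in these numbers; they only appear in the OS view.

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;
    import java.lang.management.MemoryPoolMXBean;
    import java.lang.management.MemoryUsage;

    // Prints the heap and non-heap totals plus each memory pool
    // (eden, survivor, old gen, perm gen, code cache) as the JVM reports them.
    public class MemoryBreakdown {
        public static void main(String[] args) {
            MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
            print("Heap", mem.getHeapMemoryUsage());
            print("Non-heap", mem.getNonHeapMemoryUsage());
            for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
                print(pool.getName(), pool.getUsage());
            }
        }

        private static void print(String name, MemoryUsage u) {
            // committed = memory actually taken from the OS, max = upper limit / reserved
            System.out.printf("%-20s used=%,d committed=%,d max=%,d%n",
                    name, u.getUsed(), u.getCommitted(), u.getMax());
        }
    }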

Given that, I guess the key here is the JVM option -XX:MinHeapFreeRatio=n, where n is from 0 to 100, and it specifies that the heap should be expanded if less than n% of the heap is free. It usually defaults to 40 (Sun), so when the JVM allocates memory, it grows the heap enough to keep 40% of it free (this is not applicable if you have -Xms == -Xmx). Its "twin option", -XX:MaxHeapFreeRatio, usually defaults to 70 (Sun).

Therefore, in a Sun JVM the free portion of the heap after each garbage collection is kept within 40-70%: if less than 40% of the heap is free after a GC, the heap is expanded. So assuming you are running a Sun JVM, I would guess that the "Java object heap" has reached a peak of about 445 MB of live data, forcing the heap to be expanded to about 740 MB (445 MB / 0.6 ≈ 740 MB, to guarantee 40% free). Then (object heap) + (perm space) = 740 + 250 = 990 MB.
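
One way to double-check this theory is to read the two free-ratio flags back from the running JVM. A minimal sketch (HotSpot-specific, Java 7+; the FreeRatioCheck class name is just illustrative):

    import com.sun.management.HotSpotDiagnosticMXBean;
    import java.lang.management.ManagementFactory;

    // Reads the effective free-ratio flags from the running HotSpot JVM.
    public class FreeRatioCheck {
        public static void main(String[] args) {
            HotSpotDiagnosticMXBean hs =
                    ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
            System.out.println("MinHeapFreeRatio = " + hs.getVMOption("MinHeapFreeRatio").getValue());
            System.out.println("MaxHeapFreeRatio = " + hs.getVMOption("MaxHeapFreeRatio").getValue());
        }
    }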

Maybe you can try to output GC details (for example with -verbose:gc and -XX:+PrintGCDetails) or use jconsole to verify how the heap size evolves.

PS: when dealing with issues like this, it is good to post OS and JVM details.

During the startup of your application the JVM will reserve memory equal to roughly the size of your Heap Max value (-Xmx) plus a bit more for other stuff. This prevents the JVM from having to go back to the OS to reserve more memory later.

Even if your application is only using 298 MB of heap space, there will still be 993 MB reserved with the OS. You will need to read more into reserved vs committed memory.
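
The reserved vs committed distinction shows up directly in the numbers the JVM exposes: each memory area is described by a MemoryUsage object with init, used, committed and max values. Here is a minimal sketch for the heap (the ReservedVsCommitted class name is illustrative, and the mapping of init/max to -Xms/-Xmx is approximate):

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryUsage;

    // init      ~ what -Xms asks for at startup
    // committed = heap actually obtained from the OS right now
    // max       ~ the -Xmx upper bound of the reserved address space
    public class ReservedVsCommitted {
        public static void main(String[] args) {
            MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
            System.out.printf("init=%,d used=%,d committed=%,d max=%,d%n",
                    heap.getInit(), heap.getUsed(), heap.getCommitted(), heap.getMax());
        }
    }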

Most of the articles you will read about garbage collection refer to allocation from a heap perspective, not the OS level. By reserving the memory at start-up for your application, the garbage collector can work in its own space.

If you need more details, read the article Tuning Garbage Collection. Here are some important excerpts from the document:

At initialization, a maximum address space is virtually reserved but not allocated to physical memory unless it is needed.

Also look at section 3.2 (iv) in the document

At initialization of the virtual machine, the entire space for the heap is reserved. The size of the space reserved can be specified with the -Xmx option. If the value of the -Xms parameter is smaller than the value of the -Xmx parameter, not all of the space that is reserved is immediately committed to the virtual machine.

The OS will report the memory used by the JVM plus the memory used by your program, so it will always be higher than what the JVM reports as memory usage. There is a certain amount of memory needed by the JVM itself in order to execute your program, and the OS can't tell the difference.

Unfortunately, using the system memory tools isn't a very precise way to track your program's memory consumption. JVMs typically allocate large blocks of memory so that object creation is quick, but that doesn't mean your program is consuming all of that memory.
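
A quick way to see this "allocated but not consumed" effect from inside the application is to compare total and free heap (a small illustrative sketch; jconsole shows the same numbers over time):

    // Compares heap actually holding objects with heap merely allocated from the OS.
    public class HeapSnapshot {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long total = rt.totalMemory();  // heap currently allocated by the JVM
            long free  = rt.freeMemory();   // allocated but not occupied by live objects
            System.out.printf("used=%,d allocated=%,d max=%,d%n",
                    total - free, total, rt.maxMemory());
        }
    }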

A better way of knowing what your program is actually doing is to run jconsole and look at the memory usage there. That's a very simple tool for looking at memory that's easy to set up.
