
C++ program immediately using 2 GB of RAM: how to find the culprit?

I have a program that, immediately upon initialisation, is using > 2 GB of RAM. Basically the code looks like this:

#include <blah>  // stands in for the real project headers

int main() {
    std::cout << get_mem_usage() << std::endl;
    // Lots of things happen, but no significant memory usage
    return 0;
}

Output: [2013-02-15 18:38:05.865283] 2147.71 Mb

I am, however, linking against a lot of different shared object files: I checked with ldd, and I am linking to 58 libraries with a combined .so size of 66 MB.

I'm pretty sure that the get_mem_usage function is returning the correct values: they agree with top and with massif output.

I should also mention that I am linking against the ROOT framework for High Energy Physics analysis.

I am using Linux x86_64, and the get_mem_usage() function is homegrown.

I would like to reduce the memory usage of the program, if at all possible.

My first guess: static initializations in some of the libraries you are linking against.

Insert a long pause as the very first line of main() and look through /proc/<pid>/ to see where the memory is allocated. For example:

  • /proc/12345/task/12345/maps
  • /proc/12345/task/12345/smaps

I could not find any documentation on get_mem_usage().

What kind of memory usage is it reporting?

Determining memory usage in a virtual memory system is quite a challenge.

The actual amount of physical memory used is your resident set size: basically, any physical pages you are using. With shared libraries (libc, for example), that memory consumption is (partially) shared with other applications as well.

Then there is the virtual set size: the total of all virtual memory mapped, whether anonymous mappings or mapped files. Much of that will not actually be backed by physical memory (i.e. not resident) and may be shared with other programs.

pmap -x <pid> will give you a neat table, including the resident/dirty parts of each mapping.

Overall, it is worth researching what 'uses' all that memory, but it might not be that much of a problem at all. If you are running on a 32-bit system, though, things might get tight later on, as your virtual address space is limited.

Another possibility: if you have shared memory between your apps, that shared memory counts against every app that accesses the segment, even though it is only allocated once. So if a 2 GB shared memory segment is allocated somewhere and 20 different applications attach to it, all 20 applications will appear to be using 2+ GB of memory each, making it look as if 40+ GB has been allocated in total.

In this case, it turned out to be a single library (one I wasn't even using directly) that allocated 1.9 GB of memory. I found it by looking through /proc/12345/smaps:

2aaab2197000-2aab2ba86000 rw-p 2aaab2197000 00:00 0 
Size:           1991612 kB

Looking up the address in /proc/12345/maps I found

2aaab2174000-2aaab2197000 rw-p 0016c000 4f9:2c566 59607963               /mnt/lustre/epp_scratch/atlas/sm442/Irvex/lhapdf-5.8.8/lib/.libs/libLHAPDF.so.0.0.0

This is a Fortran library that was doing an enormous amount of static initialisation.
