R: Understanding how object.size() relates to memory use

I have run a number of analyses and saved all the 24x8 ggplots into a list.

When I inspect the object size I get:

print(object.size(output_list), units = "Mb")
115.3 Mb

If I save the object as an .rds file, it takes up around 650 Mb on disk.
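
For reference, a small sketch of how the in-memory estimate can be compared with the serialized and on-disk sizes (purely illustrative, reusing the output_list name from above; serialize() returns the uncompressed bytes that saveRDS() then compresses):

print(object.size(output_list), units = "Mb")  # in-memory estimate
length(serialize(output_list, NULL)) / 1024^2  # serialized size in Mb
tmp <- tempfile(fileext = ".rds")
saveRDS(output_list, tmp)                      # compressed (gzip) by default
file.size(tmp) / 1024^2                        # size on disk in Mb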

If I inspect the memory use of the object I get:

library(pryr)
mem_used()  # note: in this example, output_list is the only object in the workspace
28.2 GB

I am not sure why I get these HUGE differences in size, and I have not been able to find a clear explanation on the web yet. Can you help me understand why this seemingly small object takes up so much memory?

I have an idea that it is because all the plots store all of their data points in their plot data. But wouldn't that also affect object.size()? I also considered that it is due to the tree structure, as e.g. in the data.tree package each node saves an individual environment.
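
To make that idea concrete, here is a minimal sketch of the mechanism I suspect (the make_plot() function and the sizes below are made up for illustration): a ggplot object keeps the full data frame in p$data and also captures the environment it was built in (p$plot_env); base object.size() does not look inside environments, while pryr::object_size() does.

library(ggplot2)
library(pryr)

make_plot <- function() {
  big <- rnorm(5e6)    # ~40 Mb that only lives in the plot's environment
  df  <- data.frame(x = rnorm(1e4), y = rnorm(1e4))
  ggplot(df, aes(x, y)) + geom_point()
}
p <- make_plot()

print(object.size(p), units = "Mb")  # small: environment contents are not counted
object_size(p)                       # larger: follows p$plot_env and finds 'big'
object_size(p$data)                  # the data frame itself is stored in the plot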

As per the help page, mem_used() sums the Ncells and Vcells: Vcells (memory used by vectors) and Ncells (memory used by everything else). In other words, it reports how much memory the whole R session is currently using, not the size of one particular object.

This seems to work fine on my laptop.

Best to do gc() as well to get a better idea.

library(pryr)       # for object_size() and mem_used()
rm(list = ls())     # start from an empty workspace
m <- 1:100000000    # 100 million integers (~400 MB)

object_size(m)
#400 MB

mem_used()
#454 MB

gc()
#           used  (Mb) gc trigger  (Mb)  max used  (Mb)
#Ncells   753450  40.3    1442291  77.1   1442291  77.1
#Vcells 51417867 392.3  121970165 930.6 101419240 773.8
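
As a rough cross-check (a sketch; column 2 of the matrix returned by gc() holds the "used" Mb for Ncells and Vcells), the two "used" values should add up to approximately what mem_used() reports, allowing for gc() printing MiB while pryr prints SI megabytes:

g <- gc()
sum(g[, 2])   # Ncells + Vcells used, in Mb; here 40.3 + 392.3 = 432.6,
              # i.e. about 454 MB, matching mem_used() above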
