
Why is the ffdf object so large?

I used read.csv.ffdf from the ff package to load an 830 MB CSV file with about 8,800,000 rows and 19 columns:

library(ff)
library(ffbase)
green_2018_ff <- read.csv.ffdf(file = "green_2018.csv", header = TRUE)

But when I check the size of green_2018_ff using object_size from the pryr package, the object is reported as about 1.13 GB in memory:

library(pryr)
object_size(green_2018_ff)  # 1.13 GB

I had assumed that an ffdf is only a memory-mapped object, so it should take very little RAM, much less than the original CSV. Is there anything wrong with my code or data? Thanks.
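
For reference, ff keeps each column in a backing file on disk and only maps it into memory on demand. A quick sanity check (a sketch, not from the original post) is to list those backing files: physical() returns the underlying ff vectors of an ffdf, and filename() gives the file each one maps to.

library(ff)

# Sketch: list the on-disk backing files behind the ffdf.
# physical() returns the underlying ff vectors of the ffdf;
# filename() gives the file each column is memory-mapped to.
sapply(physical(green_2018_ff), filename)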

It's the call to object_size itself that pulls your data into RAM; the ffdf object only holds references to the backing files on disk until something forces the data to be read.
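
A minimal way to measure the footprint without materializing the data (a sketch under the same assumptions as the check above) is to sum the sizes of the backing files on disk instead of calling object_size:

library(ff)

# Sketch: total on-disk size of the ffdf's backing files,
# measured without pulling the data into RAM.
ffdf_disk_size <- function(x) {
  files <- vapply(physical(x), filename, character(1))  # one backing file per column
  sum(file.size(unique(files)))                         # total bytes across files
}

ffdf_disk_size(green_2018_ff)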
