
Read large csv file in R

Could you tell me how to read a 3 GB CSV file? When I try to read it with read.csv I hit the error "Error: cannot allocate vector of size 250.0 Mb". My machine is an i5 with 4 GB of RAM running Windows 7 32-bit. I also tried opening the file with the ff package, and again I got an error saying I need the 64-bit version of R, as well as more RAM, to read the CSV file. Should I consider using the scan function instead? I tried scan with skip = 4 and nlines = 2, but it only gives me the first row of my data.
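On the scan issue at the end of the question, here is a minimal sketch (the file name big_file.csv is an assumption) of pulling out just rows 5 and 6: read.csv with skip/nrows returns them as a data frame, while scan with sep = "\n" returns the raw lines instead of stopping at a single row.

two_rows <- read.csv("big_file.csv", skip = 4, nrows = 2, header = FALSE)   # rows 5 and 6 as a data frame
raw_lines <- scan("big_file.csv", what = character(), sep = "\n", skip = 4, nlines = 2, quiet = TRUE)   # rows 5 and 6 as text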

R operates entirely in memory (RAM), so once it runs out of available space the read.csv call will fail. To work around this, you can try increasing the amount of memory available to R:

memory.limit(size = 8192)   # allocate 8192 MB (8 GB) for R to use

You will need the 64-bit version of R to take advantage of this, and you may have to raise the limit beyond 8 GB for the read to succeed.
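As a quick sanity check before raising the limit (base R only; memory.limit() is Windows-specific), you can confirm which build you are running and what the current ceiling is:

R.version$arch   # "x86_64" on 64-bit R, "i386" on 32-bit R
memory.limit()   # current limit in MB (Windows only)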

If the analysis you are trying to run needs more RAM than your current memory-management approach can provide, I would suggest adapting the analysis to use virtual memory (disk-backed objects), as long as you have enough hard-disk space.
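One way to do that without extra packages is to stream the file through a connection in chunks, so only a small slice is ever held in RAM. This is a minimal sketch, assuming a file called big_file.csv, a comma-separated header with no quoted commas, rows without embedded newlines, and that you only need per-chunk summaries rather than the whole table:

con <- file("big_file.csv", open = "r")
col_names <- strsplit(readLines(con, n = 1), ",")[[1]]   # read the header line once
chunk_size <- 100000                                     # rows per chunk; tune to your RAM
repeat {
  lines <- readLines(con, n = chunk_size)
  if (length(lines) == 0) break                          # end of file
  chunk <- read.csv(text = lines, header = FALSE,
                    col.names = col_names, stringsAsFactors = FALSE)
  # summarise or aggregate `chunk` here and keep only the result
}
close(con)

The ff package's read.csv.ffdf() follows the same idea but stores the chunks in disk-backed ff objects, which is closer to the virtual-memory approach suggested above.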
