
glm running out of memory in 64-bit R?

I am trying to run glm on a dataset with 255001 data points, but it's saying

Error: cannot allocate vector of size 10.0 Gb

This is very strange because when I start up R, I see the message

R version 3.1.1 (2014-07-10) -- "Sock it to Me"
Copyright (C) 2014 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)

This seems to indicate that I'm running a 64-bit version of R, and I have read that the memory limit for 64-bit versions of R on Unix is on the order of 128 TB.
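
For what it's worth, here is a quick sketch of how to confirm the 64-bit build from within R itself; these are standard base-R values, not something taken from the original post.

# Quick check that the running R build really is 64-bit.
.Machine$sizeof.pointer   # 8 on a 64-bit build, 4 on a 32-bit build
R.version$arch            # e.g. "x86_64"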

Furthermore, I have successfully run glm logistic regression on very similar datasets that are twice as large without any problem.

How can I reconcile these facts, and how can I get R to hold large objects in memory?
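
As a hedged sketch (not from the original post), one way to see what is actually consuming memory before calling glm is to list the sizes of the objects in the workspace:

# List every object in the workspace with its size in bytes, largest first.
sizes <- sapply(ls(), function(x) object.size(get(x)))
sort(sizes, decreasing = TRUE)

# Trigger a garbage collection and report how much memory is currently in use.
gc()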

It turns out there was a bug in my code: when I was reading in the data, I set header=FALSE instead of header=TRUE. Changing this fixed the problem.
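
For illustration only, here is a minimal sketch of what that fix looks like; the file name data.csv, the separator, and the response column y are hypothetical, since the original post does not show the read call. With header=FALSE the column-name row is read as data, so numeric columns typically come in as character/factor columns, and a formula over such columns can require a far larger model matrix than the plain numeric data would.

# Buggy read (sketch): the column-name row is treated as a data row, so every
# column is read as character/factor rather than numeric.
# dat <- read.table("data.csv", header = FALSE, sep = ",")

# Fixed read: use the first row as column names so columns keep numeric types.
dat <- read.table("data.csv", header = TRUE, sep = ",")
str(dat)   # verify the column types before modelling

# Logistic regression as in the post; "y" stands in for the actual response column.
fit <- glm(y ~ ., data = dat, family = binomial)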
