
R on macOS Error: vector memory exhausted (limit reached?)

I'm trying to run an R script (in particular, I am using the "getLineages" function from the Bioconductor package Slingshot).

I'm wondering why the error "vector memory exhausted (limit reached?)" is showing up when I use this function, as it does not seem to be the most memory-intensive function in this package, at least with the data I am analyzing.

I do understand that there are other questions like this on StackOverflow, but they all suggest switching over to the 64-bit version of R. However, I am already using that version. There seem to be no other answers to this issue so far; does anyone know a solution?

The data is only ~120 MB in size, which is far less than my computer's 8 GB of RAM.

R 64-bit version

For those using RStudio: I've found that setting Sys.setenv('R_MAX_VSIZE'=32000000000), as has been suggested in multiple StackOverflow posts, only works when R is run from the command line; setting that parameter while using RStudio does not prevent this error:

Error: vector memory exhausted (limit reached?)

After doing some more reading, I found this thread, which clarifies the problem with RStudio and identifies a solution, shown below:

Step 1: Open the terminal.

Step 2:

cd ~
touch .Renviron
open .Renviron

Step 3: Save the following as the first line of .Renviron:

R_MAX_VSIZE=100Gb 

Note: This limit includes both physical and virtual memory, so setting R_MAX_VSIZE=16Gb on a machine with 16 GB of physical memory may not prevent this error. You may have to play with this parameter, depending on the specs of your machine.
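After restarting R, a quick sanity check (my addition, not from the original answer) is to read the variable back; an empty string means .Renviron was not picked up:

```r
# Should print the value set in ~/.Renviron, e.g. "100Gb";
# an empty string means the file was not read at startup
Sys.getenv("R_MAX_VSIZE")
```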

I had the same problem; increasing "R_MAX_VSIZE" did not help in my case. Instead, removing the variables that were no longer needed solved the problem. Hope this helps those who are struggling here.

rm(large_df, large_list, large_vector, temp_variables)
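As a small addition to the rm() approach: removing objects only drops the references to them, so it can help to follow up with an explicit garbage-collection call, which reclaims the freed memory and prints a usage summary:

```r
# Reclaim the memory freed by rm(); gc() also prints current memory usage
gc()
```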

This can be done through RStudio as well.

library(usethis) 
usethis::edit_r_environ()

When the tab opens up in RStudio, add this as the first line: R_MAX_VSIZE=100Gb (or whatever memory you wish to allocate).

Restart R (and/or your computer) and run the R command that gave you the memory error again.

I had this problem when running Rcpp::sourceCpp("my_cpp_file.cpp"), resulting in

Error: vector memory exhausted (limit reached?)

Changing the Makevars file solved it for me. Currently it looks like this:

CC=gcc
CXX=g++
CXX11=g++
CXX14=g++
CXX18=g++
CXX1X=g++
LDFLAGS=-L/usr/lib
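The answer above does not say where this file lives: for a per-user setup, R reads ~/.R/Makevars when compiling C/C++ code, and the file can be created much like .Renviron (assuming a Unix-like shell):

```shell
# Create the user-level Makevars that R reads when compiling C/C++ code
mkdir -p ~/.R
touch ~/.R/Makevars
# then edit it with any editor, e.g. on macOS: open ~/.R/Makevars
```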


 