memory allocation of 100 bytes failed

I have some code that needs to create a large array. On my local computer (OSX) the program runs fine. However, when I try to run the program on an Ubuntu DigitalOcean droplet, I get the following error:

memory allocation of 100 bytes failed
Aborted

There isn't any other information provided in the output, but I think it has to do with initializing the vector.

fn example() {
    let n = 25;
    // 2^25 = 33,554,432 inner vectors, each holding 25 f32 values (100 bytes)
    let mut dp: Vec<Vec<f32>> = vec![vec![-1.0; n]; 2i32.pow(n as u32) as usize];
}

The size of that vector can get quite large in some instances. Is there a better way to create this large vector, or is this caused by a system memory limit?

You've requested a Vec that's too large, and the program ran out of memory. When an allocation fails like this, Rust aborts the program.

The failure to allocate 100 bytes, rather than the >3GB the whole structure needs, is surprising at first, but it's probably because the behavior of memory allocation on Linux is very unintuitive. Each inner vec![-1.0_f32; 25] is exactly 100 bytes (25 × 4 bytes), while the 2^25 rows together add up to well over 3GB. Linux pretends to have an infinite amount of memory available and will allow overly large allocations (overcommit) until it can't bluff any more, at which point even a tiny 100-byte request for one of the inner rows can be the one that fails.

You can mitigate this somewhat by using try_reserve. Since the ? operator needs a function that returns a Result, the sketch below wraps the allocation in a helper:

use std::collections::TryReserveError;

fn allocate() -> Result<Vec<f32>, TryReserveError> {
    let size = 2usize.pow(25);
    let mut vec = Vec::new();

    // this may fail, but returns an Err instead of aborting the process
    vec.try_reserve(size)?;

    // now you can safely add up to `size` elements to the vec
    vec.resize(size, -1.0);
    Ok(vec)
}
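
For illustration, a caller can then decide how to react when the reservation fails (allocate is just the placeholder name used in the sketch above):

fn main() {
    match allocate() {
        Ok(dp) => println!("allocated {} elements", dp.len()),
        Err(e) => eprintln!("could not allocate the table: {e}"),
    }
}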

There is the fallible_collections crate, which has more OOM-handling methods for Vec and other collections.

Another thing to consider is setting your own hard limit on memory usage for your Rust program using the cap allocator wrapper. This lets you avoid the wrath of Linux's OOM killer, and limit your program's allocations before they become a machine-wide problem.
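
A minimal sketch of that approach, based on cap's documented pattern of wrapping the system allocator (the 64 MiB limit is an arbitrary number chosen for illustration):

use std::alloc;
use cap::Cap;

// Wrap the system allocator so every allocation is counted against a cap.
#[global_allocator]
static ALLOCATOR: Cap<alloc::System> = Cap::new(alloc::System, usize::MAX);

fn main() {
    // Limit this process to 64 MiB; allocations beyond that fail instead of
    // dragging the whole machine toward the OOM killer.
    ALLOCATOR.set_limit(64 * 1024 * 1024).unwrap();

    println!("currently allocated: {} bytes", ALLOCATOR.allocated());
}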
