
Compiler out of heap space for large array

So I have a double array of 15,000,000 elements, and at runtime random subsets of 2,000 elements need to be extracted from it for processing.

I've tried initialising the array using the following:

static const double myArray[15000000] = {-2.1232, -6.4243, 23.432, ...};

However, when compiling I get the error "C1060: compiler is out of heap space". In Visual Studio 2019, I've gone into the project properties -> Linker -> System and modified the Heap Reserve Size to "8000000000", which I assumed would be large enough, and I have 16 GB on my machine, but I still get the same error. I've also tried using the x64 compiler, but to no avail.

I've also tried writing the array to a CSV, and then to a binary file, and reading from that at runtime instead. However, the read process takes far too long, as I ideally need to read from it several times a second.
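For reference, a one-pass raw read of such a binary file looks roughly like the sketch below (the file name myArray.bin is hypothetical, and this assumes the doubles were written as raw bytes rather than text); done once at startup, it keeps the later subset extraction entirely in memory:

#include <cstddef>
#include <fstream>
#include <vector>

// Read the whole raw-double file into memory in one pass.
std::vector<double> loadArray(const char* path) {
    std::ifstream in(path, std::ios::binary);
    in.seekg(0, std::ios::end);                      // find file size
    std::vector<double> data(static_cast<std::size_t>(in.tellg()) / sizeof(double));
    in.seekg(0, std::ios::beg);
    in.read(reinterpret_cast<char*>(data.data()),    // bulk-read all doubles at once
            static_cast<std::streamsize>(data.size() * sizeof(double)));
    return data;
}

// Usage: auto data = loadArray("myArray.bin");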

I'm relatively new to C++, and especially new when it comes to memory allocation. What would you suggest as a solution?

If you have your 15M doubles in binary format, you can embed that into your binary and reference it. The run-time cost is just a bit more disk IO when first loading your binary, but that should be much faster than parsing a CSV.
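A minimal sketch of that approach on Windows/MSVC, assuming the raw doubles are attached through a resource script line such as MYDATA RCDATA "myArray.bin" (both names hypothetical):

#include <windows.h>
#include <cstddef>

// Locate the blob embedded via the .rc file. LockResource returns a
// pointer into the mapped executable image, so the data is available
// with no copy and no parsing at run time.
const double* embeddedArray(std::size_t& count) {
    count = 0;
    HRSRC res = FindResource(nullptr, TEXT("MYDATA"), RT_RCDATA);
    if (!res) return nullptr;
    HGLOBAL blob = LoadResource(nullptr, res);
    if (!blob) return nullptr;
    count = SizeofResource(nullptr, res) / sizeof(double);
    return static_cast<const double*>(LockResource(blob));
}

On other toolchains the same effect is usually achieved with objcopy or an assembler .incbin directive.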

The problem may be that you have enough memory, but it is not contiguous. So my suggestion is to use std::list.
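A minimal sketch of that suggestion: filling a std::list at run time avoids the huge compile-time initializer entirely, and its per-node allocation sidesteps the need for one contiguous ~120 MB block (file name hypothetical, as above):

#include <fstream>
#include <list>

// Build the list at run time instead of using a huge braced initializer.
std::list<double> loadAsList(const char* path) {
    std::list<double> values;
    std::ifstream in(path, std::ios::binary);
    double d;
    while (in.read(reinterpret_cast<char*>(&d), sizeof d))
        values.push_back(d);                 // each element gets its own node
    return values;
}

Note that random access into a std::list is linear-time, so for extracting arbitrary subsets several times a second a contiguous container may still be preferable.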
