
C# System.OutOfMemoryException: 'Exception of type 'System.OutOfMemoryException' was thrown.'

I get this System.OutOfMemoryException. YES: I have added

<runtime>
    <gcAllowVeryLargeObjects enabled="true" />
</runtime>

to the App.config file located in the project folder. I have 64-bit Windows 10 with about 20GB of my 32GB of RAM free, and I'm using Visual Studio 2022.

I have also tried to set x64 everywhere I can inside Visual Studio.

The code that generates my Exception is:

List<int> dataList = new List<int>();
for (int j = 0; j < int.MaxValue - 10; j++)
{
    dataList.Add(j);
    if ((j % 100000) == 0)
    { Console.WriteLine(j.ToString("N0")); }
}
Console.WriteLine($"Total items in the List: {dataList.Count.ToString("N0")}. ");

The last printout on the console is: 2,146,400,000. I never get to the part that writes dataList.Count. What can be done to increase the memory allocated to C# and/or Visual Studio?

The last printout on the console is: 2,146,400,000.

The List<T> type uses a doubling algorithm to manage capacity. Each time it reaches capacity, it allocates a new internal array twice the size of the old one, and then manually copies the elements from the old array to the new.
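Here's a quick sketch that makes the doubling visible: it prints Capacity each time it changes.

var list = new List<int>();
int lastCapacity = list.Capacity;
for (int i = 0; i < 1_000_000; i++)
{
    list.Add(i);
    if (list.Capacity != lastCapacity)
    {
        // Prints 4, 8, 16, 32, ... -- a fresh, twice-as-large array each time.
        Console.WriteLine($"Count = {list.Count:N0}, new Capacity = {list.Capacity:N0}");
        lastCapacity = list.Capacity;
    }
}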

So the first thing we learn is, when you know the final size (as you do here), letting the list grow itself is incredibly inefficient. Instead, construct the list with the desired capacity in mind:

var listSize = int.MaxValue - 10;
var dataList = new List<int>(listSize);
for (int j = 0; j < listSize; j++)

This will not only save the extra allocations, but also all the work of copying elements between buffers. And it tells you up front (via exception) if you're going to run out of RAM, instead of making the computer do a bunch of work before finally failing.

The second thing we should notice is that every time the list doubles, it briefly needs sufficient memory for both arrays. To add the next element to a list at a full capacity of a little over 2,146,400,000, we briefly need space for a little over 6,439,200,000 elements (the current array plus the new array, which is twice the size). Since .NET ints are 32 bits (4 bytes) each, that's roughly 24GB of RAM.
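A quick sanity check of that arithmetic:

long oldCapacity = 2_146_400_000;        // roughly where the asker's output stopped
long newCapacity = oldCapacity * 2;      // the replacement array the list must allocate
long bytesNeeded = (oldCapacity + newCapacity) * sizeof(int);
Console.WriteLine($"{bytesNeeded / (1024.0 * 1024 * 1024):N1} GiB");  // ~24.0 GiB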

The next thing to understand is the difference between available memory and available address space. I've heard the garbage collector has gotten better at this since I last did a deep dive, but there's still a concern here. Your process may be able to use all the memory available on your system (and then some, thanks to paging), but it still has to address it. Running out of address space will also throw an OutOfMemoryException.

Every time the list doubles, when the garbage collector releases the RAM from the old internal array, that RAM does go back to the OS, but there's still a hole in the address space of the process. The garbage collector is able to clean these holes up through a process called "compaction", but it hasn't always been very good at this step, and even when it works that's another level of inefficiency. Again, setting the capacity in advance (if you know it) can really help.
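(If you ever do need to reclaim address space from these large arrays -- which live on the large object heap -- the runtime does let you request a one-time LOH compaction:)

using System.Runtime;

// Compact the large object heap on the next full collection.
GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect();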


But what if we could do even better? Let's look at an alternative. I suggest Enumerable.Range():

var items = Enumerable.Range(0, int.MaxValue - 10);
foreach (var j in items)
{
    if (j % 100000 == 0)
    {
        Console.WriteLine(j.ToString("N0")); 
    }
    // do something else with every j
}

The above code will have effectively the same output, and uses less than 50 bytes of RAM (4 bytes for j, 4 bytes for the 100000 constant, 4 bytes each for MaxValue and 10, a 20-byte object reference to an internal items state machine, a 4-byte integer for the internal state machine's current state, and a 4-byte integer for the terminal state -- I may be missing something else, but it's small).

This is a savings of more than 515,000,000X!

In fact, it might not even use that much, if the compiler decides to use registers for any of those integers.

Even this much presumes a defined sequence of some kind. There's no reason we couldn't also use a traditional for loop, without a list or range at all, which would save even more.
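For completeness, that list-free version is just:

for (int j = 0; j < int.MaxValue - 10; j++)
{
    if (j % 100000 == 0)
    {
        Console.WriteLine(j.ToString("N0"));
    }
}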


Update:

I may be missing something else, but it's small.

Yes, I'm missing the size of the "N0" format string (~32 bytes -- a 20-byte object reference, a 4-byte integer for the length, and two 2-byte UTF-16 characters) and the size of each of the output strings, which must also exist in memory... but still only one at a time. I estimate the largest of these strings at 76 bytes, plus the 32 bytes for N0, means we're using about 152 bytes total. So the saving is only a factor of 169 million. Only.

But as a practical matter the garbage collector isn't going to clean up these string objects after every single loop iteration, so actual measured usage will be somewhat higher.

Given that the List<T>.Capacity property accepts and returns an int, that may be a clue that a List is limited to around 2,147,483,647 elements (although I have not found anything in the docs to verify this).
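One way to check on .NET 6 and later: Array.MaxLength reports the largest array the runtime will allow, and a List<T> is backed by a single array, so it can never hold more elements than this.

// .NET 6+: the largest length any single array may have.
Console.WriteLine(Array.MaxLength.ToString("N0"));  // 2,147,483,591 on current runtimes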

So, one solution might be to have a List of Lists, for example:

const int k_list_size_max = 2;
const int k_data_items_max = 2000000000;

// Debug.WriteLine needs: using System.Diagnostics;
UInt64 total_items = 0;

List<List<int>> data_list = new List<List<int>>( k_list_size_max );

for ( int i = 0; i < k_list_size_max; i++ )
{
   List<int> data_items_list = new List<int>( k_data_items_max );

   for ( int j = 0; j < k_data_items_max; j++ )
   {
      data_items_list.Add( j );
   }

   // Keep the inner list in the outer list (the original snippet never did this).
   data_list.Add( data_items_list );

   Debug.WriteLine( $"List[{i}] data_items_list.Count = {data_items_list.Count}" );

   total_items += ( UInt64 )data_items_list.Count;
}

Debug.WriteLine( $"Total items: {total_items}. " );

Output:

List[0] data_items_list.Count = 2000000000
List[1] data_items_list.Count = 2000000000
Total items: 4000000000. 

What can be done to increase the memory allocated to C# and/or Visual Studio?

As your goal seems to be allocating memory, you can do that much better if you use larger structures. Just define a struct with a few doubles or Guids and then make a list of those structs.

Then your 2 billion elements will easily use more than 32GiB.
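For example, a minimal sketch (the struct name and field choices are just illustrative):

// Four doubles = 32 bytes per element, eight times the size of an int.
struct BigItem
{
    public double A, B, C, D;
}

// 1 billion 32-byte items is ~30GiB; 2 billion would be ~60GiB.
// Still needs gcAllowVeryLargeObjects (the backing array exceeds 2GB),
// and you may want to scale count down to fit your machine.
const int count = 1_000_000_000;
var items = new List<BigItem>(count);
for (int i = 0; i < count; i++)
{
    items.Add(new BigItem { A = i });
}
Console.WriteLine($"Items: {items.Count:N0}");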
