
Trying to load 150+ grayscale 4096 x 4096 bitmaps. Need help getting around the 2GB limit, I think

Solved: I assumed, incorrectly, that Visual Studio 2012 defaulted to building 64-bit apps when creating new projects. Under the project's Properties, on the Build tab, there is a checkbox labeled "Prefer 32-bit". When I unchecked that checkbox and rebuilt my solution, I was able to load and process over 200 bitmaps at a time. Thanks to Mike Trusov for so politely pointing out the solution to me.
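For reference, the same setting lives in the project file; unchecking the box corresponds to this MSBuild property (a sketch of the relevant fragment only, not a complete .csproj):

```xml
<!-- Inside the appropriate <PropertyGroup> of the .csproj -->
<PropertyGroup>
  <PlatformTarget>AnyCPU</PlatformTarget>
  <Prefer32Bit>false</Prefer32Bit>
</PropertyGroup>
```

At runtime you can confirm which flavor you actually got with `Environment.Is64BitProcess` (or `IntPtr.Size == 8`).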

Original question: I've run into a small problem. I have more than 150 grayscale bitmaps that I want to load into RAM on an 8 GB system, but I can't seem to get past 50 or so without throwing an exception. I've tried loading the bitmaps into an array, which is a lost cause, of course, because .NET caps any single object at 2 GB. But it failed long before 2 GB worth of bitmaps were loaded. So I tried loading them into a List<List<Bitmap>>, but even that fails at about the same place. At the moment of the exception, Task Manager says I have 3939 MB of available RAM waiting to be filled. Very strange.

I don't need these bitmaps to be contiguous in RAM. They can be a scattered bunch of 16 MB allocations for all I care. I just want to be able to fill available RAM with a bunch of bitmaps. That's all.
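A quick back-of-envelope check (my own numbers, assuming the BMPs stay at 8 bits per pixel in memory, which matches the 16 MB figure above):

```csharp
using System;

class MemoryMath
{
    static void Main()
    {
        // 4096 x 4096 grayscale pixels at 8 bpp, no padding assumed.
        long bytesPerBitmap = 4096L * 4096;      // 16,777,216 bytes = 16 MiB
        long totalBytes = 150 * bytesPerBitmap;  // 2,516,582,400 bytes, about 2.34 GiB
        Console.WriteLine(totalBytes);
    }
}
```

So 150 images clears the 2 GB mark even in the best case. If GDI+ were to expand each image to 32 bpp on load, each would cost 64 MiB instead, and 150 of them would blow past 8 GB of physical RAM, so the pixel format matters here.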

The exception has at various times been an OutOfMemoryException or an ArgumentException, depending on how much available RAM I had when I started the program. In either case, the stack trace dies inside System.Drawing.Bitmap..ctor(String filename). There is nothing wrong with the specific file being loaded at the time of the exception. In fact, when I have it load a different (or even overlapping) set of bitmaps, the error occurs at the same iteration.

Does anyone have a clue they can lend me on how to do this? Am I running into the .NET 2GB limit in some strange way?

To respond to a few questions and comments: I'm using Visual Studio 2012, .NET 4.5, and 64-bit Windows 7 on a computer with 8 GB of RAM. Yes, I need all of those bitmaps in RAM at the same time, for a variety of reasons (performance, image processing, etc.). I have pondered using gcAllowVeryLargeObjects, but I don't need or want all my bitmaps in one long chunk of contiguous memory; I would much rather each Bitmap use its own separate memory allocation. Besides, if I had a machine with 64 GB of RAM, it would be absurd to be limited to even 150 bitmaps of that size. Why won't these bitmaps load without throwing an OutOfMemoryException?
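(For completeness: gcAllowVeryLargeObjects is an app.config switch, and it only lifts the 2 GB cap for individual arrays on 64-bit targets; it wouldn't have helped here, since each Bitmap is its own object. A sketch of the setting, in case someone lands here actually needing it:)

```xml
<!-- app.config: permits single arrays larger than 2 GB in 64-bit processes (.NET 4.5+) -->
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```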

To me, it seems that .NET is trying to keep all Bitmaps in a single 2 GB region. If there were a way to get each Bitmap to (saying more than I know here) have its own separate address space, that might solve the problem. To invoke the language of the long-ago days of MS-DOS, I want to allocate and access far memory using a long pointer, not have all my data stuck in a single near segment.

Here is the array code:

    List<String> imageFiles;            // List of .bmp filenames.
    Bitmap[] bmps = new Bitmap[100];    // Stores/accesses the Bitmaps.

    private void goButton_Click(object sender, EventArgs e)
    {
        int i;

        // Load the bitmaps
        if (bmps[0] == null)
        {
            // Load the list of bitmap files.
            imageFiles = Directory.EnumerateFiles(@"C:\Temp", "*.bmp", SearchOption.TopDirectoryOnly).ToList();

            // Read bitmap files
            for (i = 0; i < bmps.Length; ++i)
            {
                bmps[i] = new Bitmap(imageFiles[i]);    // <-- Exception occurs here when i == 52 or so.
            }
        }
    }

Here is the List<List<Bitmap>> code:

    List<String> imageFiles;                                // List of .bmp filenames.
    List<List<Bitmap>> bmps = new List<List<Bitmap>>(100);  // Stores/accesses the Bitmaps.

    private void goButton_Click(object sender, EventArgs e)
    {
        int i;

        // Load the bitmaps
        if (bmps.Count == 0)
        {
            // Load the list of bitmap files.
            imageFiles = Directory.EnumerateFiles(@"C:\Temp", "*.bmp", SearchOption.TopDirectoryOnly).ToList();

            // Read bitmap files
            for (i = 0; i < 100; ++i)
            {
                // Load the bitmap into temporary Bitmap b.
                Bitmap b = new Bitmap(imageFiles[i]);   // <-- Exception occurs here when i == 52 or so.

                // Create and add a List<Bitmap> that will receive the clone of Bitmap b.
                bmps.Add(new List<Bitmap>(1));

                // Clone Bitmap b and add the clone to the new List<Bitmap>.
                bmps[i].Add((Bitmap)b.Clone());

                // Dispose Bitmap b.
                b.Dispose();
            }
        }
    }

There should be no issue loading more than 2 GB of bitmaps into memory in a 64-bit app running on a 64-bit OS (which I'm guessing you're on). Check your project settings; you might have to create a new Any CPU configuration (based on the existing x86 one). Also, a simple list should work:

    var imageFiles = Directory.EnumerateFiles(@"C:\Temp", "*.bmp", SearchOption.TopDirectoryOnly).ToList();
    var lst = new List<Bitmap>();
    foreach (var imageFile in imageFiles)
    {
        lst.Add(new Bitmap(imageFile));
    }

Do they ALL have to be loaded at the same time? Could you load, say, 20 of them, and then, while you are processing or displaying those, have a background thread prep the next 20?
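A minimal sketch of that idea, assuming a caller-supplied processing callback (the `batchSize` parameter and `process` delegate are my own placeholder names, not from the original post):

```csharp
using System;
using System.Collections.Generic;
using System.Drawing;
using System.Linq;
using System.Threading.Tasks;

static class BatchLoader
{
    // Walk the file list in batches of batchSize; while one batch is being
    // processed on the caller's thread, the next is loaded on a background thread.
    public static void Run(IList<string> files, int batchSize, Action<List<Bitmap>> process)
    {
        Func<int, List<Bitmap>> loadBatch = start =>
            files.Skip(start).Take(batchSize).Select(f => new Bitmap(f)).ToList();

        Task<List<Bitmap>> prefetch = Task.Run(() => loadBatch(0));
        for (int start = 0; start < files.Count; start += batchSize)
        {
            List<Bitmap> current = prefetch.Result;          // wait for the prefetched batch
            int nextStart = start + batchSize;
            prefetch = Task.Run(() => loadBatch(nextStart)); // overlap disk I/O with work

            process(current);
            foreach (Bitmap b in current)
                b.Dispose();                                 // release GDI+ handles promptly
        }
    }
}
```

Disposing each batch before moving on keeps the working set at roughly two batches, which sidesteps the memory pressure entirely, at the cost of reloading anything you need to revisit.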
