
OutOfMemoryException in C#

This code causes some kind of memory leak. I assume it's caused by the new byte[]. But shouldn't the GC prevent this? If the program runs long enough, the code eventually throws an OutOfMemoryException.

using (var file = new FileStream(fileLoc, FileMode.Open))
{
    int chunkSize = 1024 * 100;
    while (file.Position < file.Length)
    {
        if (file.Length - file.Position < chunkSize)
        {
            chunkSize = (int)(file.Length - file.Position);
        }
        // a fresh buffer of up to 100 KB is allocated on every iteration
        byte[] chunk = new byte[chunkSize];
        file.Read(chunk, 0, chunkSize);
        context.Response.BinaryWrite(chunk);
    }
}

The problem is almost certainly that you're repeatedly allocating new arrays, and each one has to be allocated as a contiguous block of memory, so I can understand how it's chewing through memory.

How about rejigging things slightly so that you create the buffer once and reuse it, only allocating a new, smaller one for the final chunk when the remaining bytes are less than the standard chunk size?

using (var file = new FileStream(fileLoc, FileMode.Open)) {
    int chunkSize = 1024 * 100;
    byte[] chunk = new byte[chunkSize]; // allocated once, outside the loop

    while (file.Position < file.Length) {
        if (file.Length - file.Position < chunkSize) {
            // final chunk: only here do we need a new, smaller buffer
            chunkSize = (int)(file.Length - file.Position);
            chunk = new byte[chunkSize];
        }
        // note: this assumes Read fills the buffer completely, which holds
        // for a local FileStream but is not guaranteed by the Stream contract
        file.Read(chunk, 0, chunkSize);
        context.Response.BinaryWrite(chunk);
    }
}

May I suggest that you try with a smaller buffer size?

The problem could be that you're repeatedly allocating memory blocks of 85,000 bytes or more: those go to a special heap, the Large Object Heap, that unfortunately is never compacted!

See here for a detailed explanation of how the Large Object Heap works. This can unfortunately lead to severe heap fragmentation and ultimately cause an out-of-memory error like the one you're describing (see here: LOH fragmentation causes OutOfMemory exception).
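If you want to see the threshold yourself, here is a minimal console sketch (the exact array-length cut-off is slightly below 85,000 bytes because the object header counts toward the size):

using System;

class LohDemo
{
    static void Main()
    {
        byte[] small = new byte[84 * 1000];  // below the LOH threshold: small object heap
        byte[] large = new byte[85 * 1000];  // at the threshold: Large Object Heap
        Console.WriteLine(GC.GetGeneration(small));  // prints 0: a fresh gen-0 object
        Console.WriteLine(GC.GetGeneration(large));  // prints 2: LOH objects report generation 2
    }
}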

If you allocate smaller chunks (under 85,000 bytes), they will be allocated on the regular heap, where the GC is able to perform compaction, and your problem will almost certainly be gone. I would also strongly recommend modifying your code as suggested by @Nanhydrin, since that avoids repeated allocations and should perform slightly better.
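A minimal sketch of that combination, assuming context.Response is an HttpResponse as in the question: allocate a single buffer below the LOH threshold once, reuse it for every read, and write through Response.OutputStream instead of BinaryWrite so the final partial chunk needs no extra allocation:

using (var file = new FileStream(fileLoc, FileMode.Open))
{
    byte[] chunk = new byte[64 * 1024];  // 64 KB: allocated once, stays on the small object heap
    int bytesRead;
    while ((bytesRead = file.Read(chunk, 0, chunk.Length)) > 0)
    {
        // write only the bytes actually read, so the last partial chunk
        // does not require a separate, smaller buffer
        context.Response.OutputStream.Write(chunk, 0, bytesRead);
    }
}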

I'm not entirely sure what you mean by "runs long enough", however that code allocates an array of up to 100 KB on every pass through the loop. In itself this probably won't cause a failure, but in an environment with limited virtual address space (for example, a 32-bit process) it is a reasonably large allocation, and if many of these requests run in parallel the memory usage can easily multiply to the point where you see an OutOfMemoryException.

Assuming that context.Response is an HttpResponse, it looks like you are just trying to write the contents of a file to an HTTP response, in which case you can do this far more efficiently with something like the following:

using (var file = new FileStream(fileLoc, FileMode.Open))
{
    CopyStream(file, context.Response.OutputStream);
}

See Best way to copy between two Stream instances - C# for an implementation of CopyStream that copies the data in small chunks rather than attempting to read the entire file in one go.
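The implementation there follows the usual copy-loop pattern; a sketch along those lines (not a verbatim copy of the linked answer):

using System.IO;

public static void CopyStream(Stream input, Stream output)
{
    // a small, reused buffer keeps every allocation well below the LOH threshold
    byte[] buffer = new byte[32 * 1024];
    int bytesRead;
    while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
    {
        output.Write(buffer, 0, bytesRead);
    }
}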
