
C#: Confusion about out of memory

I have been reading about out-of-memory errors for some time now, and I have figured out that in most cases an OutOfMemoryException (at least in .NET) isn't caused by the system actually running out of memory, but rather by the system being unable to allocate a contiguous block of the requested size due to fragmentation.
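A minimal sketch of how that can happen (the sizes and the 32-bit assumption are mine, not from the question): arrays of 85,000 bytes or more land on the Large Object Heap, which the GC does not compact by default, so freeing every other array leaves the memory half empty but riddled with holes that no single large allocation can span.

```csharp
using System;
using System.Collections.Generic;

class FragmentationDemo
{
    static void Main()
    {
        // Allocate many 1 MB arrays; each is >= 85,000 bytes,
        // so they go on the Large Object Heap (LOH).
        var blocks = new List<byte[]>();
        try
        {
            for (int i = 0; i < 1500; i++)
                blocks.Add(new byte[1024 * 1024]);
        }
        catch (OutOfMemoryException)
        {
            // Stop once the address space is nearly full (32-bit process).
        }

        // Free every other block: roughly half the memory is now free,
        // but only as 1 MB holes scattered across the LOH.
        for (int i = 0; i < blocks.Count; i += 2)
            blocks[i] = null;
        GC.Collect();

        try
        {
            // Requests one contiguous 100 MB block. No single hole is that
            // large, so a 32-bit process typically throws here even though
            // hundreds of MB are technically free.
            var big = new byte[100 * 1024 * 1024];
            Console.WriteLine("100 MB allocation succeeded.");
        }
        catch (OutOfMemoryException)
        {
            Console.WriteLine("OOM despite plenty of free (fragmented) memory.");
        }

        GC.KeepAlive(blocks);
    }
}
```

Since .NET Framework 4.5.1, setting GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce before a full GC.Collect() compacts the LOH once and can recover a situation like this.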

What I don't really understand is that I've been in a situation where I still get an OutOfMemoryException even when I try to allocate a large chunk of contiguous memory at application startup (e.g. loading 100 images). Since the application has only just started, it's safe to assume that not many allocations and de-allocations have happened yet, so there should be plenty of free contiguous blocks available. In that case, why would the application still be hit by a memory-fragmentation issue?

Note that I'm also fairly certain the issue was not caused by the application actually running out of its allotted memory quota, because loading 100 images in my specific case only takes ~200 MB or so.
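One way to tell "not enough headroom" apart from "fragmented address space" before the load even starts is System.Runtime.MemoryFailPoint, which probes whether a given number of megabytes (including sufficient contiguous virtual address space) is likely to be satisfiable. A sketch, where the 200 MB figure comes from the question and LoadImages is a hypothetical stand-in for the 100-image load:

```csharp
using System;
using System.Runtime;

class StartupCheck
{
    static void Main()
    {
        try
        {
            // Probe for ~200 MB before loading the images. This checks the
            // page file and free virtual address space, and throws
            // InsufficientMemoryException (rather than a mid-load OOM)
            // if the request is unlikely to succeed.
            using (new MemoryFailPoint(200))
            {
                LoadImages();
            }
        }
        catch (InsufficientMemoryException ex)
        {
            Console.WriteLine($"Not enough memory headroom: {ex.Message}");
        }
    }

    static void LoadImages()
    {
        // Placeholder for the actual image-loading code from the question.
    }
}
```

If the probe succeeds but the load still throws, the usual culprit is the ~2 GB (and often far less contiguous) address space of a 32-bit process, and building the application as 64-bit is the common fix.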

In my experience, OutOfMemoryException mostly means poor object management. It's symptomatic of creating too many objects too fast while the GC has a hard time keeping up. Setting aside the few kinds of applications that take memory and never give it back (like SQL Server), out-of-memory errors can usually be prevented with caching and a well-defined object life cycle.
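To make "a well-defined object life cycle" concrete (my illustration, not part of the original answer): System.Buffers.ArrayPool&lt;T&gt; lets you rent and return large buffers instead of allocating a fresh array per operation, which keeps churn off the Large Object Heap entirely.

```csharp
using System;
using System.Buffers;

class BufferReuse
{
    static void Main()
    {
        // Process many payloads without allocating a new 1 MB array each
        // time; the pool hands back recycled buffers, so the LOH is not
        // churned and the GC has far less work to do.
        for (int i = 0; i < 1000; i++)
        {
            byte[] buffer = ArrayPool<byte>.Shared.Rent(1024 * 1024);
            try
            {
                ProcessPayload(buffer); // hypothetical per-item work
            }
            finally
            {
                // Returning the buffer is what gives it a defined life
                // cycle; forgetting this degrades the pool back to plain
                // allocation.
                ArrayPool<byte>.Shared.Return(buffer);
            }
        }
    }

    static void ProcessPayload(byte[] buffer)
    {
        // Placeholder: fill/read the buffer here. Note that Rent may return
        // an array larger than requested, so track the logical length
        // yourself.
    }
}
```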
