
Suppressing C# garbage collection

My application allocates a large amount of memory (millions of small objects totaling several gigabytes) and holds onto it for a long time.

  1. Is .NET wasting time checking through all of this data to do GC on it?
  2. How often does the Gen 2 GC occur (the one that checks all objects)?
  3. Is there any way to reduce its frequency or temporarily suppress it from occurring?
  4. I know exactly when I am ready for a large amount of memory to be collected; is there any way to optimize for that? I am currently calling GC.Collect(); GC.WaitForPendingFinalizers(); at that time.
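
For reference, here is roughly the idiom I run at that point (a sketch; the second Collect is there to reclaim anything a finalizer resurrected):

    // Run when the large object set is known to be dead.
    GC.Collect();                  // full blocking collection
    GC.WaitForPendingFinalizers(); // let pending finalizers drain
    GC.Collect();                  // collect objects freed by those finalizers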

Update: Perf counter "% Time in GC" is showing an average of 10.6%.

Unless you can confirm that the garbage collector is actively slowing the performance of your application, you should not take steps to cripple the functionality of your runtime environment.

Judging from your question, you have not confirmed that the GC is a problem. I seriously doubt that it is.

Optimize only what needs to be optimized.

Look at the System.Runtime.GCSettings.LatencyMode property.
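
For example, a minimal sketch that defers blocking collections around a latency-sensitive region (GCLatencyMode.LowLatency should only be held briefly):

    using System;
    using System.Runtime;

    class LatencyModeDemo
    {
        static void DoLatencySensitiveWork()
        {
            GCLatencyMode oldMode = GCSettings.LatencyMode;
            try
            {
                // LowLatency suppresses blocking gen-2 collections while set.
                GCSettings.LatencyMode = GCLatencyMode.LowLatency;
                // ... allocate and work here ...
            }
            finally
            {
                GCSettings.LatencyMode = oldMode; // always restore the old mode
            }
        }
    }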

Enabling the server GC (the gcServer element in app.config) will also help cut down on GCs (in my case, ten times fewer collections when enabled).
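
For example, a minimal app.config sketch:

    <configuration>
      <runtime>
        <!-- Opt in to the server GC flavor. -->
        <gcServer enabled="true"/>
      </runtime>
    </configuration>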

You can stop the garbage collector from finalizing any of your objects using the static method:

    GC.SuppressFinalize(yourObject);

Note that this only skips the finalizer; the object's memory is still collected normally.
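
In the common dispose pattern it is called from Dispose, a minimal sketch:

    using System;

    class ResourceHolder : IDisposable
    {
        public void Dispose()
        {
            // ... release resources here ...
            // The finalizer no longer needs to run for this object.
            GC.SuppressFinalize(this);
        }

        ~ResourceHolder()
        {
            // Runs only if Dispose was never called.
        }
    }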


You can measure this using Performance Monitor. Open perfmon and add the .NET CLR Memory performance counters. These counters are process specific; with them you can track the number of collections, the sizes of the various generations, and, most relevant for you, "% Time in GC". Here is the explanation text for this counter:

% Time in GC is the percentage of elapsed time that was spent in performing a garbage collection (GC) since the last GC cycle. This counter is usually an indicator of the work done by the Garbage Collector on behalf of the application to collect and compact memory. This counter is updated only at the end of every GC and the counter value reflects the last observed value; it's not an average.

If you watch these counters while running your program, you should have an answer about the frequency and cost of the GC resulting from your memory decisions.

Here is a good discussion of the various GC Performance Counters. It seems that 10% is borderline okay.
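
If you want to read the counter from code rather than in perfmon, a minimal sketch looks like this ("MyApp" is a placeholder for your process's instance name):

    using System;
    using System.Diagnostics;

    class GcTimeProbe
    {
        static void Main()
        {
            // ".NET CLR Memory" / "% Time in GC" are the category and
            // counter names shown in perfmon; "MyApp" is a placeholder.
            using (var gcTime = new PerformanceCounter(
                ".NET CLR Memory", "% Time in GC", "MyApp", readOnly: true))
            {
                // The value reflects the last completed GC, not an average.
                Console.WriteLine("% Time in GC: {0:F1}", gcTime.NextValue());
            }
        }
    }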

It will only (usually) happen when the GC needs some gen2 memory anyway (because gen1 is full). Are you asking this speculatively, or do you actually have a problem with GC taking a large proportion of your execution time? If you don't have a problem, I suggest you don't worry about it for the moment - but keep an eye on it with performance monitors.
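
A cheap way to keep an eye on it from code is GC.CollectionCount (the RunWorkload call below is a hypothetical stand-in for the code you suspect):

    // Count how many gen-2 collections a suspect workload triggers.
    int gen2Before = GC.CollectionCount(2);
    RunWorkload();                 // hypothetical workload under test
    int gen2Delta = GC.CollectionCount(2) - gen2Before;
    Console.WriteLine("Gen 2 collections: " + gen2Delta);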

My ASP.NET application - a B2B system - used to start at 35-40 MB when the first user came to it. After a few minutes, the application would grow to 180 MB with only 2 or 3 users hitting pages. After reading the .NET development best practices and the GC performance guidelines, I found out that the problem was my application's design. I did not agree at first.

I was horrified by how easily we can make mistakes. I gave up many features and started slimming some objects down. Meaning:

  1. Avoid mixing many pages with "intelligent", communicative user controls (the ones packed with functionality, most of which is instantiated again for every page that uses the control).

  2. Stop putting universal functionality into base classes. Sometimes it is preferable to repeat yourself; inheritance has a cost.

  3. For some complex functionality I put everything in the same function, yes, reaching 100 lines or more. When I read this recommendation in the .NET performance guidance I did not believe it, but it works. Deep call stacks are a problem, using class properties instead of local variables is a problem, and class-level variables can be hell…

  4. Stop using complex base classes; no base class with more than 7 lines should exist. If you spread bigger base classes across the entire framework, you'll have problems.

  5. I started using more static objects and functionality. I once saw an application someone else designed in which all the data-access object methods (insert, update, delete, select) were static. That application never exceeded 45 MB, even with more concurrent users.

  6. To keep some projects healthy, I like the steady state pattern. I learned it in the real world, but the author Michael Nygard also agrees with me in his book Release It!: Design and Deploy Production-Ready Software. He calls this approach the steady state pattern; it says we may need something to free up idle resources.

  7. You may want to play with the processModel element in the machine.config file. Its memoryLimit attribute indicates the percentage of memory that can be used before the worker process recycles (see the config sketch after this list).

  8. You may also want to play with the gcServer element in the config file. This setting dictates the machine's GC behaviour (workstation GC versus server GC), and it can dramatically change memory consumption behaviour too.
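
For illustration, a sketch of the two configuration knobs from items 7 and 8 (the memoryLimit value of 60 is just an example; tune it for your server):

    <configuration>
      <system.web>
        <!-- Recycle the ASP.NET worker process once it uses ~60% of memory. -->
        <processModel memoryLimit="60"/>
      </system.web>
      <runtime>
        <!-- Server GC: throughput-oriented, with per-core heaps. -->
        <gcServer enabled="true"/>
      </runtime>
    </configuration>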

I had a lot of success when I started to care about these items. Hope this helps.

-- EDITED ON 04-05-2014: I've changed my mind about many things due to improvements in newer GC versions and the advances of HTML5 and the MVC framework.
