
Calculating Averages with Performance Counters

I have a service process, and I want to use performance counters to publish the average time that it takes to complete tasks. I am using the AverageTimer32 counter to do this.

It's almost working the way I want, but not quite: When I increment the counter, it will briefly bump up to the value that I expect (watching in Performance Monitor), but then it drops right back down to zero.

So, the counter is zero, I run a task, the task completes, the counter briefly bumps up (to the correct value), but then it almost immediately falls back to zero.

I am using the AverageTimer32 counter with an AverageBase as the denominator. I increment the AverageBase by 1 every time I start a task, and I increment the AverageTimer32 by the number of ticks the task took every time a task finishes. Can anyone give me a push?
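Roughly, the pattern looks like this (a minimal sketch; the "MyService" category and the counter names are placeholders, and both counters are assumed to be registered already, with the AverageBase counter created immediately after the AverageTimer32 counter):

```csharp
using System;
using System.Diagnostics;

// Placeholder category/counter names; writable instances of existing counters.
var avgTaskTime = new PerformanceCounter("MyService", "Average Task Time", readOnly: false);
var avgTaskTimeBase = new PerformanceCounter("MyService", "Average Task Time Base", readOnly: false);

long start = Stopwatch.GetTimestamp();   // task starts
avgTaskTimeBase.Increment();             // AverageBase += 1 per task started

// ... run the task ...

long elapsedTicks = Stopwatch.GetTimestamp() - start;
avgTaskTime.IncrementBy(elapsedTicks);   // AverageTimer32 += ticks to complete
```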

It turns out that the reason I could not do what I wanted is that none of the performance counter types provide an automatically calculated running average. (The "average" counter types compute an average over the current sample interval, like "bytes per second", which is why AverageTimer32 falls back to zero whenever no tasks complete during an interval.)

I wanted a running average. So, I used the RawFraction performance counter type.
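Setting that up looks roughly like this (a minimal sketch; the category and counter names are placeholders, not the ones from my service):

```csharp
using System.Diagnostics;

// One-time setup (e.g. in an installer). A RawFraction counter must be
// immediately followed by its RawBase denominator in the same collection.
var counters = new CounterCreationDataCollection
{
    new CounterCreationData(
        "Average Task Time", "Average time per completed task.",
        PerformanceCounterType.RawFraction),
    new CounterCreationData(
        "Average Task Time Base", "Denominator for Average Task Time.",
        PerformanceCounterType.RawBase)
};

PerformanceCounterCategory.Create(
    "MyService", "Counters published by my service.",
    PerformanceCounterCategoryType.SingleInstance, counters);
```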

There was one problem with that method: the formula multiplies the fraction by 100 to produce a percentage, and I wanted the raw number (the average time per task, not a percentage).

So, I incremented the denominator of the fraction by 100 for every 1 operation (offsetting the percentage calculation).
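In code, the trick looks something like this (again just a sketch; I'm assuming the numerator is incremented in milliseconds so the counter reads out as average milliseconds per task):

```csharp
using System;
using System.Diagnostics;

var avgTaskTime = new PerformanceCounter("MyService", "Average Task Time", readOnly: false);
var avgTaskTimeBase = new PerformanceCounter("MyService", "Average Task Time Base", readOnly: false);

var sw = Stopwatch.StartNew();
// ... run the task ...
sw.Stop();

// RawFraction is displayed as 100 * numerator / denominator, so adding 100
// to the base per task cancels the percentage:
//   100 * (total ms) / (100 * task count) = average ms per task
avgTaskTime.IncrementBy(sw.ElapsedMilliseconds);
avgTaskTimeBase.IncrementBy(100);
```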

My result: I can now display how long it takes, on average, for my service to complete a task. If my service isn't busy, the average holds steady instead of dropping back to zero, so you can see the long-term trend of the service's performance.
