
best way to count how many times per second the method is called

I have a dll method that should be "QoSed" - this method should be called at most 100 times per second:

    private static extern int ExecTrans(int connectionId);

This method is used in only one place in the program, so it's OK to QoS that place. I need a separate "QoS counter" for each connectionId, so ExecTrans(1) and ExecTrans(2) should go to different counters.

In a first iteration I would just like to count how often the method is called (for each connectionId), i.e. I want "live statistics". There are two approaches:

- allow the limit to be exceeded for a short period. For example, allow "100 transactions from 0 to 1 second, 100 transactions from 1 to 2 seconds, and 200 transactions from 0.5 to 1.5 seconds".
- over any one-second interval, transactions should not exceed 100.

For now I don't care which of these methods is used, but I would select the one that creates less overhead. I want the QoS to add as little extra work as possible, because this is trading software sensitive to every 0.1 ms.

As for the first approach, I think I can use something like this (pseudo-code; stats and curStats should probably be made thread-safe):

private int[] stats;     // statistics to display to the user
private int[] curStats;  // statistics currently being collected

void OnOneSecondElapsed(object source, ElapsedEventArgs args) {
    foreach (int conId in connIds) {
        stats[conId] = curStats[conId];
        curStats[conId] = 0;
    }
}

void MyMethod() {
    ......
    ExecTrans(conId);
    ++curStats[conId];
    ......
}

As for the second approach... is it possible to make a collection whose objects live for exactly one second and then disappear? Then every time I would add the next object to the collection, unless the collection already contains 100 objects.

What do you think? I'm not familiar with the C# class library, so I'm probably missing some useful classes; maybe you can suggest another approach.

A first approach:

  • Use a ConcurrentQueue<DateTime>
  • Before every request, check the size of the queue. If it already holds 100 or more entries, cancel the request
  • Otherwise, enqueue the current DateTime and execute the request
  • In a background thread, every 0.1 second, remove entries older than 1 second

It should be fairly efficient, but:

  • Since there's no locking between the time you check the queue count and the time you enqueue, you may sometimes get slightly over 100 requests per second
  • Since the background thread executes every 0.1 second, if you receive 100 requests at the same time, it may block the queue for up to 1.1 seconds. Adjust the sleep time as needed.
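The steps above can be sketched roughly as follows. This is a minimal illustration, not code from the original post; the `RateLimiter` name, the use of `DateTime.UtcNow`, and keeping one instance per connectionId are my assumptions:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Sketch of the approach above: one ConcurrentQueue<DateTime> per limiter.
// A background thread prunes entries older than one second every 0.1 s.
class RateLimiter
{
    private readonly ConcurrentQueue<DateTime> _queue = new ConcurrentQueue<DateTime>();
    private readonly int _limit;

    public RateLimiter(int limit)
    {
        _limit = limit;
        var pruner = new Thread(() =>
        {
            while (true)
            {
                DateTime ts;
                // Drop timestamps older than one second from the head of the queue.
                while (_queue.TryPeek(out ts) && ts < DateTime.UtcNow.AddSeconds(-1))
                    _queue.TryDequeue(out ts);
                Thread.Sleep(100);  // adjust for precision vs. overhead
            }
        });
        pruner.IsBackground = true;
        pruner.Start();
    }

    // Returns true if the caller may proceed, false if the request should be dropped.
    // Note: check-then-enqueue is not atomic, so the limit can be slightly exceeded,
    // exactly as the caveat above describes.
    public bool TryAcquire()
    {
        if (_queue.Count >= _limit)
            return false;
        _queue.Enqueue(DateTime.UtcNow);
        return true;
    }
}
```

To get a separate counter per connectionId, you could keep these in a `Dictionary<int, RateLimiter>` and call `TryAcquire()` before each `ExecTrans(conId)`.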

I may be wrong but I don't think there's a perfect solution. Basically, the more precise the system is, the more overhead there is. You have to adjust the parameters depending on your needs.

There is a tool called a profiler that does exactly what you are looking for. You run it with your code and it tells you exactly how much time was spent in each method and how many times each method was called. Here is an old thread about C# profilers. If you're a professional developer you may already have a company license for a profiler.

In case someone needs to measure rather than throttle... here's a naive approach that gives you a rough estimate:

class A {
    private int _calls;
    private Stopwatch _sw;   // System.Diagnostics.Stopwatch

    public A() {
        _calls = 0;
        _sw = Stopwatch.StartNew();
    }

    public void MethodToMeasure() {
        // Do stuff
        _calls++;
        if (_sw.ElapsedMilliseconds > 1000) {
            _sw.Stop();
            // Save or print _calls here before it's zeroed
            _calls = 0;
            _sw.Restart();
        }
    }
}

In some cases you will get called more than n times per second; I'm assuming you just don't want to do any actual processing for the extra calls.

You can use a synchronized queue object to hold the transactions to be processed for each connection. Calling your method would just enqueue data saying what should be done. In a separate processing thread (either one for the whole system or one per connection) you can then dequeue the operations and process them at a rate of one per 0.01 seconds. Just truncate the queue to 100 entries (pop down to 100) before enqueueing each unit of work for a given connection and voila, you discard the extra work items.
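A minimal sketch of such a synchronized, truncating queue, assuming a plain lock around a `Queue<T>` and an `int` payload standing in for whatever the work item really is (both are illustrative assumptions, not from the original answer):

```csharp
using System.Collections.Generic;

// Sketch of the per-connection work queue described above.
// Producers call Add(); the worker thread polls TryTake() at its fixed rate.
class WorkQueue
{
    private readonly Queue<int> _queue = new Queue<int>();  // payload type is up to you
    private readonly object _lock = new object();

    // Enqueue one work item, first popping the queue down to 100 entries
    // so that excess work is simply discarded.
    public void Add(int workItem)
    {
        lock (_lock)
        {
            while (_queue.Count >= 100)
                _queue.Dequeue();
            _queue.Enqueue(workItem);
        }
    }

    // Called by the worker thread; returns false when there is nothing to do.
    public bool TryTake(out int workItem)
    {
        lock (_lock)
        {
            if (_queue.Count == 0) { workItem = 0; return false; }
            workItem = _queue.Dequeue();
            return true;
        }
    }
}
```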

Note: You'll need a precise timing function to enforce one transaction per 0.01 seconds, e.g.:

Stopwatch watch = Stopwatch.StartNew();
long nextPause = watch.ElapsedMilliseconds + 10;
while (true)
{
    //do work (dequeue one item and process it)

    long now = watch.ElapsedMilliseconds;
    if( now < nextPause ) {
        Thread.Sleep( (int)(nextPause - now) );
    }
    nextPause = watch.ElapsedMilliseconds + 10;
}

Note: If a transaction takes longer than 10 milliseconds (1/100th of a second), you may drop extra work items...

If you want the worker thread to be more "bursty" you could process multiple work items in one loop and use a longer wait time, which would require partial waits with a partial "items left" count. (Also, it'd be better to use Monitor.Pulse and Monitor.Wait instead of Sleep...)
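For the Monitor-based variant mentioned above, a minimal sketch might look like this. The class and method names are illustrative, and the 10 ms timeout mirrors the one-per-0.01-second rate from the answer:

```csharp
using System.Threading;

// Sketch: the worker waits up to 10 ms per tick, but wakes immediately
// if a producer pulses the shared lock after enqueueing work.
class TickSignal
{
    private readonly object _lock = new object();

    // Producer side: call after enqueueing a work item.
    public void NotifyWork()
    {
        lock (_lock) { Monitor.Pulse(_lock); }
    }

    // Worker side: blocks until either work arrives or the 10 ms tick elapses.
    public void WaitForTick()
    {
        lock (_lock) { Monitor.Wait(_lock, 10); }
    }
}
```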
