
How to capture accurate execution time in C#

I'm trying to capture the exact execution time of a function:

Stopwatch regularSW = new Stopwatch();
for (int i = 0; i < 10; i++) {
    regularSW.Start();
    //function();
    regularSW.Stop();

    Console.WriteLine("Measured time: " + regularSW.Elapsed);
}

I also tried DateTime and Process.GetCurrentProcess().TotalProcessorTime, but each time I get a different value.

How can I get the same value?

With Stopwatch you are already using the most accurate way. But you are not restarting it in the loop, so each measurement resumes from the value where the previous one ended. You either have to create a new Stopwatch each iteration or call Stopwatch.Restart instead of Start:

Stopwatch regularSW = new Stopwatch();
for (int i = 0; i < 10; i++) {
    regularSW.Restart();
    //function();
    regularSW.Stop();
    Console.WriteLine("Measured time: " + regularSW.Elapsed);
}

That's the reason for the different values. If you still get different values after this fix, then the method function really does have varying execution times, which is not that unlikely (e.g. if it performs a database query).
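Another option is to create a fresh Stopwatch per iteration with Stopwatch.StartNew, which avoids any accumulation entirely. A minimal sketch, with a Thread.Sleep standing in for your function():

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class Program {
    static void Main() {
        for (int i = 0; i < 10; i++) {
            // StartNew creates and starts a brand-new Stopwatch,
            // so no elapsed time carries over from earlier iterations.
            Stopwatch sw = Stopwatch.StartNew();
            Thread.Sleep(10); // placeholder for function()
            sw.Stop();
            Console.WriteLine("Measured time: " + sw.Elapsed);
        }
    }
}
```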


Since this question seems to be largely theoretical (judging by your comments), consider the following things if you want to measure time in .NET:

  • Compile and run in Release mode, Any CPU (on an x64 machine), with optimizations on.
  • A tick is 0.0001 milliseconds, so don't overestimate the precision of your results.
  • The values differ because you cannot control what other operations your system needs to perform in the background while your C# program is running.
  • If the method allocates memory, for example because you fill a local list, the garbage collector might run during the measurement to reclaim memory.
  • C# code is compiled just in time. The first time you go through a loop can therefore be hundreds or thousands of times more expensive than every subsequent time, due to the cost of the jitter analyzing the code that the loop calls. If you intend to measure the "warm" cost of a loop, you need to run the loop once before you start timing it. If you intend to measure the average cost including the JIT time, you need to decide how many runs make up a reasonable number of trials, so that the average works out correctly.
  • You are running your code in a multithreaded, multiprocessor environment where threads can be switched at will, and where the thread quantum (the amount of time the operating system will give another thread until yours might get a chance to run again) is about 16 milliseconds. 16 milliseconds is about fifty million processor cycles. Coming up with accurate timings of sub-millisecond operations can be quite difficult if a thread switch happens within one of the several million processor cycles that you are trying to measure. Take that into consideration.

The last two points were copied from this answer by Eric Lippert (worth reading).
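Putting those points together, a common pattern is to call the method once to warm up the JIT, then time many iterations and report the average. A sketch, where Function() is a placeholder workload standing in for your own function():

```csharp
using System;
using System.Diagnostics;

class Benchmark {
    static void Function() {
        // Placeholder workload standing in for function().
        double x = 0;
        for (int i = 0; i < 100_000; i++) x += Math.Sqrt(i);
    }

    static void Main() {
        Function(); // warm-up run so JIT compilation cost is excluded

        const int trials = 100;
        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < trials; i++)
            Function();
        sw.Stop();

        // Averaging over many trials smooths out thread switches and GC pauses.
        Console.WriteLine("Average: " +
            (sw.Elapsed.TotalMilliseconds / trials) + " ms");
    }
}
```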
