
How do I run a function with precise timing?

I need a function to run at a precise time, within +/- 1 ms. I have tried the following, but I end up with a 15 ms minimum time between executions.

void Main()
{
    System.Timers.Timer timer = new System.Timers.Timer(1);    // executes every 15ms
    timer.Elapsed += new System.Timers.ElapsedEventHandler(myFunction);

    System.Timers.Timer timer2 = new System.Timers.Timer(5);   // executes every 15ms
    timer2.Elapsed += new System.Timers.ElapsedEventHandler(myFunction);

    System.Timers.Timer timer3 = new System.Timers.Timer(20);  // executes every 31ms
    timer3.Elapsed += new System.Timers.ElapsedEventHandler(myFunction);

    timer.Start();
    timer2.Start();
    timer3.Start();
}

void myFunction(object sender, System.Timers.ElapsedEventArgs e)
{
    doWork();
}

Using Thread.Sleep() produces the same results.

Synopsis of the application:

I will be reading in a file that contains 1553 messages (each with a timestamp). I need to replay these messages with timing as close as possible to what the file records. The timestamps are recorded in microseconds, but I only need millisecond accuracy.

This is done using a DDC 1553 card (a PCI card). I have an analyzer that lets me view the messages, including the delta time between messages, so I can measure my accuracy.

The machine I'm using has a quad-core CPU with hyperthreading. Using a busy loop (for (int i = 0; ...)) I can get accuracy to within 0.5 ms. However, this is very inefficient, and I would prefer a more realistic and more portable method if possible.
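One way to replay timestamped messages without a pure busy loop is to schedule every message against a single absolute Stopwatch baseline (so error does not accumulate), sleeping coarsely and only spinning for the final stretch. The sketch below assumes microsecond timestamps; SendMessage() is a hypothetical stand-in for the 1553 card's API, not a real library call.

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class MessageReplayer
{
    // Sketch: replay messages against one absolute clock so per-message
    // timing error does not accumulate. Timestamps are in microseconds.
    static void Replay((long timestampUs, string payload)[] messages)
    {
        if (messages.Length == 0) return;
        long baseUs = messages[0].timestampUs;
        var clock = Stopwatch.StartNew();

        foreach (var (timestampUs, payload) in messages)
        {
            double targetMs = (timestampUs - baseUs) / 1000.0;
            // Coarse phase: each Sleep(1) can take up to ~15.6 ms on a
            // default Windows configuration, so stop well before the deadline.
            while (clock.Elapsed.TotalMilliseconds < targetMs - 18.0)
                Thread.Sleep(1);
            // Fine phase: spin for the last stretch (costs CPU, gains precision).
            while (clock.Elapsed.TotalMilliseconds < targetMs)
                Thread.SpinWait(10);
            SendMessage(payload);
        }
    }

    // Hypothetical placeholder for handing the message to the 1553 card.
    static void SendMessage(string payload) =>
        Console.WriteLine($"sent '{payload}'");

    static void Main()
    {
        Replay(new[] { (0L, "msg1"), (5000L, "msg2"), (12000L, "msg3") });
    }
}
```

This only burns CPU during the final ~18 ms before each deadline rather than for the whole replay, which is a reasonable compromise between the 0.5 ms busy loop and the 15 ms timer granularity.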

.NET, C#, and even Windows in general are not real-time environments suitable for very fine timing.

The worst options are the Timer classes and Thread.Sleep().

You can measure elapsed time fairly accurately using the Stopwatch class, but when it comes to accurately waiting for a set amount of time to pass, there is no built-in mechanism.

If you could outline exactly what you are trying to do, assuming it's not motion control, hardware interfacing etc, there is probably a better solution than relying on very accurate timers.

Update: Neal, if you are interfacing with hardware in a timing-sensitive way, you should use a different solution: a PIC chip, an FPGA, an I/O card or interface, basically anything dedicated. You can do a tight loop with Stopwatch, but it will consume a lot of CPU for as long as it runs, and it probably still won't be accurate enough.

You can use the high-resolution timer, but it is device dependent, so you'll have to query for it. See this MSDN page for an explanation: http://msdn.microsoft.com/en-us/library/aa964692%28v=vs.80%29.aspx
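In .NET the query is already wrapped for you: Stopwatch.IsHighResolution reports whether the high-resolution performance counter is available on this machine, and Stopwatch.Frequency gives its rate, so you can check before relying on it:

```csharp
using System;
using System.Diagnostics;

class TimerInfo
{
    static void Main()
    {
        // Stopwatch falls back to DateTime ticks when no high-resolution
        // performance counter exists, so query the capability first.
        Console.WriteLine($"High-resolution: {Stopwatch.IsHighResolution}");
        Console.WriteLine($"Frequency: {Stopwatch.Frequency} ticks/s");
        double nsPerTick = 1e9 / Stopwatch.Frequency;
        Console.WriteLine($"Resolution: {nsPerTick:F1} ns/tick");
    }
}
```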

But System.Diagnostics.Stopwatch should already give you precision near 1 ms.

You could use System.Threading.Timer which has reasonable accuracy. Just keep in mind that it doesn't post onto the UI thread, so you'll need to delegate any UI interaction properly.
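A minimal sketch of System.Threading.Timer follows; note the callback runs on a thread-pool thread, so any UI update must be marshalled back (e.g. via Control.Invoke or a Dispatcher). The 10 ms period here is an arbitrary example value:

```csharp
using System;
using System.Threading;

class ThreadingTimerDemo
{
    static void Main()
    {
        using var done = new ManualResetEvent(false);
        int count = 0;

        // The callback fires on a thread-pool thread, NOT the UI thread.
        var timer = new Timer(_ =>
        {
            if (Interlocked.Increment(ref count) >= 5)
                done.Set();
        }, null, dueTime: 0, period: 10);  // example: first tick now, then every 10 ms

        done.WaitOne();
        timer.Dispose();
        Console.WriteLine($"fired {count} times");
    }
}
```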

You can also use the multimedia timer to do this, which has very high resolution timing. See http://www.codeproject.com/KB/miscctrl/lescsmultimediatimer.aspx
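The multimedia timer lives in winmm.dll; even if you do not use its callbacks, its timeBeginPeriod/timeEndPeriod functions raise the system timer resolution, which tightens Thread.Sleep and the Timer classes globally. A Windows-only P/Invoke sketch (always pair every timeBeginPeriod with a matching timeEndPeriod):

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Threading;

class TimerResolution
{
    // winmm.dll multimedia timer functions (Windows only).
    [DllImport("winmm.dll")]
    static extern uint timeBeginPeriod(uint uMilliseconds);

    [DllImport("winmm.dll")]
    static extern uint timeEndPeriod(uint uMilliseconds);

    static void Main()
    {
        timeBeginPeriod(1); // request 1 ms system timer resolution
        try
        {
            var sw = Stopwatch.StartNew();
            Thread.Sleep(1); // should now land near 1-2 ms, not ~15 ms
            Console.WriteLine(
                $"Sleep(1) took {sw.Elapsed.TotalMilliseconds:F2} ms");
        }
        finally
        {
            timeEndPeriod(1); // restore the previous resolution
        }
    }
}
```

The raised resolution is system-wide and increases power consumption, so keep it active only while the precise timing is actually needed.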

I have no idea how to do something like this in a modern version of Windows - just stumbled across this old question and recalled I faced a similar problem an even longer time ago...

Back in the stone age (Win 3.11 and later Win95), I was able to obtain very high and repeatable real time performance (10kHz was no problem, jitter was pretty decent as well - in the microsecond range on a 90MHz Pentium) by reprogramming the real-time interrupt and hooking into the timer's Non Maskable Interrupt. At the time this involved a VxD (virtual device driver) to be able to directly access the timer. Also needed to obtain a dedicated shared memory space and code in assembly (could probably mix assembly with C/C++ - obviously, the higher the frequency, the tighter the loop needs to be).

Basically I reduced the timer's period to the one I desired, then executed my code - it would periodically call back into the OS so that the OS would experience the interval it was expecting. Also needed to hook the functions that the OS used to adjust the interval and adjust my callbacks accordingly (assuming my code was always running at a higher frequency than the OS wanted). Actually used it to do motion control via the printer port. Never made it into released software but did get a basic bench CNC going.

The code stopped working well by Win98 and I never tried my hand at it again. Accessing the hardware has gotten more complicated and almost always "virtualized" in some way.

I would start by looking at device driver programming and possibly games (e.g. DirectX) when trying to obtain some type of RTOS performance outside a dedicated RTOS environment.
