
How do I increase Windows interrupt latency to stress-test a driver?

I have a driver & device that seem to misbehave when the user does any number of complex things (opening large Word documents, opening lots of files at once, etc.), but that do not reliably go wrong when any one thing is repeated. I believe it's because the driver does not handle high interrupt latency situations gracefully.

Is there a reliable way to increase interrupt latency on Windows XP to test this theory?

I'd prefer to write my test program in Python, but C++ & WinAPI is also fine...

My apologies for not having a concrete answer, but an idea to explore would be to use either C++ or Cython to hook into the timer interrupt (the clock-tick one) and waste time in there. This will effectively increase interrupt latency.

I don't know if there's an existing tool for this, but you can build your own.

On Windows, all interrupts are prioritized by IRQL. If driver code is running at a high IRQL, your driver won't be able to service its own interrupt while that interrupt's level is lower; at the very least, it won't be able to run on the same processor.

I'd do the following:

  1. Configure your driver to run on a single processor (I don't remember exactly how to do this, but such an option definitely exists).
  2. Add an I/O control code to your driver.
  3. In your driver's dispatch routine, busy-wait at a high IRQL (more on this below; a dispatch-routine sketch follows this list).
  4. Call your driver (via DeviceIoControl) to simulate the stress (a user-mode sketch is at the end of this answer).
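
For steps 2 and 3, a minimal dispatch-routine sketch (C, WDM) could look like the code below. IOCTL_SIMULATE_LATENCY and BusyWaitAtHighIrql are placeholder names of my own choosing; the body of BusyWaitAtHighIrql would be the busy-wait snippet shown further down.

#include <ntddk.h>

/* Hypothetical control code; the user-mode test program must use the same value. */
#define IOCTL_SIMULATE_LATENCY \
    CTL_CODE(FILE_DEVICE_UNKNOWN, 0x800, METHOD_BUFFERED, FILE_ANY_ACCESS)

/* Wrapper around the high-IRQL busy wait shown later in this answer. */
VOID BusyWaitAtHighIrql(VOID);

NTSTATUS DispatchDeviceControl(PDEVICE_OBJECT DeviceObject, PIRP Irp)
{
    PIO_STACK_LOCATION stack = IoGetCurrentIrpStackLocation(Irp);
    NTSTATUS status = STATUS_INVALID_DEVICE_REQUEST;

    UNREFERENCED_PARAMETER(DeviceObject);

    if (stack->Parameters.DeviceIoControl.IoControlCode == IOCTL_SIMULATE_LATENCY)
    {
        /* Stall this processor at high IRQL, delaying lower-priority interrupts. */
        BusyWaitAtHighIrql();
        status = STATUS_SUCCESS;
    }

    Irp->IoStatus.Status = status;
    Irp->IoStatus.Information = 0;
    IoCompleteRequest(Irp, IO_NO_INCREMENT);
    return status;
}

You would register this handler in DriverEntry (DriverObject->MajorFunction[IRP_MJ_DEVICE_CONTROL] = DispatchDeviceControl;) and create a symbolic link so the user-mode program can open the device.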

The busy wait may look something like this:

KIRQL oldIrql;
__int64 t1, t2;

/* Raise to the highest IRQL so that all other interrupts on this
   processor are masked while we spin. */
KeRaiseIrql(HIGH_LEVEL, &oldIrql);    /* HIGH_LEVEL is 31 on x86 */

/* KeQuerySystemTime reports time in 100-nanosecond units. */
KeQuerySystemTime((LARGE_INTEGER*) &t1);

while (1)
{
    KeQuerySystemTime((LARGE_INTEGER*) &t2);

    if (t2 - t1 > /* put the needed time interval, in 100-ns units */)
        break;
}

KeLowerIrql(oldIrql);
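
For step 4, a user-mode caller (C, WinAPI) might be as simple as the sketch below. The symbolic-link name "\\.\MyTestDriver" is a placeholder for whatever link your driver actually creates, and IOCTL_SIMULATE_LATENCY must match the value defined in the driver.

#include <windows.h>
#include <winioctl.h>
#include <stdio.h>

/* Must match the (hypothetical) control code defined in the driver. */
#define IOCTL_SIMULATE_LATENCY \
    CTL_CODE(FILE_DEVICE_UNKNOWN, 0x800, METHOD_BUFFERED, FILE_ANY_ACCESS)

int main(void)
{
    /* "\\\\.\\MyTestDriver" is a placeholder symbolic-link name. */
    HANDLE hDevice = CreateFileA("\\\\.\\MyTestDriver",
                                 GENERIC_READ | GENERIC_WRITE,
                                 0, NULL, OPEN_EXISTING, 0, NULL);
    if (hDevice == INVALID_HANDLE_VALUE)
    {
        printf("CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    /* Each call makes the driver spin at high IRQL for the configured interval,
       i.e. one simulated latency spike. Loop this while exercising the device. */
    DWORD bytesReturned = 0;
    if (!DeviceIoControl(hDevice, IOCTL_SIMULATE_LATENCY,
                         NULL, 0, NULL, 0, &bytesReturned, NULL))
    {
        printf("DeviceIoControl failed: %lu\n", GetLastError());
    }

    CloseHandle(hDevice);
    return 0;
}

If you prefer Python for the test harness, the same DeviceIoControl call can be issued through ctypes or pywin32 (win32file.DeviceIoControl); only the kernel side has to be C.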
