
Odd behavior with setInterval() in Node.js (Windows only, works on Linux)

I have been having some strange issues with Node.js on Windows machines that do not occur on Linux machines.

When using the setInterval() function at low delay settings (< 100 ms), the provided function is called less often than I would expect, and the shortfall is very consistent. Using the example code I quickly wrote up below, Windows machines consistently give different results than Linux machines, even when running on the same hardware.

//should be time in MS between executions
var delay = 10;

//how many seconds to run the test
var testPeriods = 10;

var counter = 0;
var startTime = new Date().getTime();

var interval = setInterval(() => {
    ++counter;
    if (new Date().getTime() >= startTime+1000*testPeriods) {
        console.log('Mean Function Calls per Second:', counter/testPeriods);
        clearInterval(interval);
    }
}, delay);

Some test data, from my PC, which dual-boots Linux and Windows:

Delay (ms) | Expected | Linux result | Windows result
-----------+----------+--------------+---------------
       100 |       10 |         10.0 |            9.2
        50 |       20 |         20.0 |           16.0
        25 |       40 |         39.8 |           32.0
        10 |      100 |         98.4 |           63.8

You'll notice that the Linux results match the expected values almost perfectly until the lowest delay setting, and even then they aren't far off. The Windows results, on the other hand, fall well short.

I figure it's likely that the Windows version of Node is poorly optimized compared to the Linux version. So at first I assumed that the function I was providing simply took too long to execute, thereby delaying the next execution. However, this does not seem to be the case. After all, if I assume the provided function takes a similar time to execute regardless of the delay setting, then I know the Windows machine can execute it up to ~63 times a second at the lowest delay setting. So why does it only execute ~32 times a second when it should manage ~40 (with the delay at 25 ms)?
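One way to check this directly is to log the gap between consecutive callbacks instead of only the average rate. A minimal sketch (assuming a recent Node with process.hrtime.bigint()):

// Sketch: log the actual gap between consecutive callbacks in ms,
// to see the timer granularity directly rather than just the average rate.
let last = process.hrtime.bigint();

const probe = setInterval(() => {
    const now = process.hrtime.bigint();
    console.log('gap:', Number(now - last) / 1e6, 'ms');
    last = now;
}, 10);

// stop the probe after one second
setTimeout(() => clearInterval(probe), 1000);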

If anybody could give me some insight into why this is happening or what I am doing wrong, that would be much appreciated.

EDIT: Simplified code at the suggestion of @jfriend00, and updated test results to match.

How quickly a piece of scheduled code will be called depends on your OS's timer interrupt interval. Depending on the OS, this setting is known as the tick or jiffy. On some OSes it is configurable.

An OS periodically runs its own code to do things like making processes take turns using the CPU, cleaning up memory, etc. To do this, the OS sets up a timer interrupt (conceptually like setInterval, only at the hardware level) to run its own code.

This is how OSes manage to run more processes/threads than there are cores available, and it is this mechanism that also drives setInterval and setTimeout in JavaScript at the OS level. So both threads and setInterval rely on the same OS mechanism; the only difference is that threads can use several cores at once, while setInterval can only use the same core as the main JS thread.

There is a trade-off in how often you set your tick. Setting the tick very short makes your OS more real-time, as the lag in event processing is greatly reduced. However, it also means that OS code uses a higher percentage of CPU time, leaving less for your apps. Setting the tick longer gives your apps more CPU time, thus giving you more throughput.
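To see what granularity your own machine actually delivers, a rough probe (a minimal sketch: chain nominally 1 ms timeouts and measure the real elapsed time) could look like this:

// Rough probe of the timer granularity Node's event loop sees on this OS:
// chain 200 nominally 1 ms timeouts and report the average real delay of each.
const iterations = 200;
let count = 0;
const start = process.hrtime.bigint();

function tick() {
    if (++count >= iterations) {
        const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
        console.log('average real delay per 1 ms timeout:',
            (elapsedMs / iterations).toFixed(2), 'ms');
        return;
    }
    setTimeout(tick, 1);
}

setTimeout(tick, 1);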

I'm actually a bit surprised by your results. Years ago (early 2000s) the default jiffy setting on Linux was quite long, as Linux was optimized more for server use and thus for throughput. Windows, on the other hand, had a shorter tick, as Windows was optimized more for real-time tasks such as playing games and running graphical apps. Perhaps things have changed over the years.

So yes, if you want cross-platform consistency, there is a minimum setInterval time that works across OSes. However, I would have guessed that 10ms would be enough, as that was my experience over the years (1ms would obviously expose the different behavior of the various OSes). If 10ms does not work these days, you can try a longer interval such as 50ms‡ or 100ms.

‡ Note: 50ms (20 updates per second) is the update interval of radio control transmitters, so it is real-time enough for human reflexes to fly planes, helicopters and drones without crashing.
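If the goal is a consistent average rate rather than precisely spaced calls, another option is to keep the interval short but let the callback catch up on any ticks the OS timer delivered late, using wall-clock time as the reference. A minimal sketch (targetDelay and doWork are placeholders, not from the code above):

// Sketch: keep a consistent average rate even on a coarse OS timer by
// catching up on any units of work that are owed according to the clock.
const targetDelay = 10;              // desired ms between units of work
let next = Date.now() + targetDelay;

function doWork() {
    // placeholder for the real per-tick work
}

setInterval(() => {
    const now = Date.now();
    while (now >= next) {            // run every unit we owe, even if late
        doWork();
        next += targetDelay;
    }
}, targetDelay);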
