
Why does setInterval in JavaScript not work properly?

<html>
<head>
<script>

window.onload = function(){
    setTimeout(function(){

        alert("hi");    
    }, 4000);

    var time = 0;
    var para = document.getElementsByTagName("p");
    setInterval(function(){

        time += 1;

        para[0].innerHTML = time;

    },1);

}
</script>
</head>
<body>
<p></p>
</body>
</html>

This is my code. I would like to measure how much time has passed when the alert is displayed. I expect the value to be 4000, but it only reaches 9xx after 4 seconds. Why does this happen?

setInterval isn't very precise. The only guarantee it makes is that the time between calls is greater than or equal to the interval passed. In your case, one of the major issues is that the minimum interval is actually specified to be four milliseconds.
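To see this for yourself, here is a minimal sketch (not part of the original code) that logs the real gap between setInterval callbacks requested at 1 ms:

var last = Date.now();
var ticks = 0;
var id = setInterval(function(){
    var now = Date.now();
    ticks += 1;
    // Log how long it actually took since the previous callback
    console.log("tick " + ticks + ": " + (now - last) + " ms since last call");
    last = now;
    if (ticks >= 10) {
        clearInterval(id); // stop after a few samples
    }
}, 1);

After the first few ticks most browsers clamp the interval, so the logged gaps are typically around 4 ms or more rather than 1 ms.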

You can get a perfectly accurate measurement using new Date().getTime() (or Date.now(), where supported) – this is in milliseconds, and is generally what you should use when doing anything related to measuring time (or even most animation, just because it's simpler).

To expand on what I said in the comment, this should work:

setTimeout(function(){
    alert("hi");
}, 4000);

var startTime = new Date();                    // record when we started
var para = document.getElementsByTagName("p");
setInterval(function(){
    var timeElapsed = new Date() - startTime;  // real elapsed milliseconds
    para[0].innerHTML = timeElapsed;
},1);

Now even if the callback doesn't run every millisecond but, say, every 28 ms, the displayed time stays correct.
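As an aside (not from the original answer), if sub-millisecond precision matters, performance.now() is a commonly used alternative to Date: it returns a monotonic, high-resolution timestamp that is unaffected by system clock changes. A sketch of the same idea:

var start = performance.now();
var para = document.getElementsByTagName("p");
setInterval(function(){
    // performance.now() returns milliseconds since the page started loading
    var elapsed = Math.round(performance.now() - start);
    para[0].innerHTML = elapsed;
}, 1);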
