
How to make jQuery continuously resend the same HTTP GET request

I am implementing software that reads system status with jQuery using $.getJSON.

The idea is that I am displaying real-time data in the browser and want to update the data as often as possible. The server is also running on localhost, so theoretically I should be able to get very low latency (well under 50 ms, possibly even under 20 ms).

The question is this: how can I implement getJSON with jQuery so that the HTTP GET request is automatically resent as soon as the previous request has finished?

In my current implementation I have defined an update function:

function updateSimulation() {       
   $.getJSON('/simulation/', function(data) {
      // ..parse data here
   });    
}

And added a callback once every 50 ms:

document.simulationIntervalId = setInterval( "updateSimulation()", 50 );

The problem is that if a request takes longer than 50 ms, several requests will be pending at the same time, and their responses may arrive out of order, meaning that newer data may be overwritten by older data, which is not desirable.

So basically one of the following two solutions could be considered: 1) when a response arrives, all still-pending requests are aborted (not optimal), or 2) a new request is only sent after the previous one has returned (perhaps the optimal solution).
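For reference, option 1) can be sketched using the jqXHR object that $.getJSON returns, whose .abort() method cancels a still-pending request. Below, a minimal stub with the same shape stands in for $.getJSON so the bookkeeping can be traced end to end; the URL '/simulation/' is taken from the question, and everything else is illustrative.

```javascript
// Stand-in for $.getJSON: returns an object with .abort(), and a
// .deliver() hook so this sketch can simulate a response arriving later.
function getJSON(url, success) {
  var aborted = false;
  return {
    deliver: function (data) { if (!aborted) success(data); },
    abort: function () { aborted = true; }   // drop the pending response
  };
}

var currentRequest = null;
var received = [];

function updateSimulation() {
  if (currentRequest) {
    currentRequest.abort();                  // kill the outstanding request
  }
  currentRequest = getJSON('/simulation/', function (data) {
    received.push(data);                     // only non-aborted responses land here
  });
  return currentRequest;
}

var first = updateSimulation();              // request 1 goes out
var second = updateSimulation();             // request 2 aborts request 1
first.deliver('stale');                      // arrives too late, dropped
second.deliver('fresh');                     // the only data that is used
```

With a real jqXHR the shape is the same: keep the object returned by $.getJSON and call .abort() on it before issuing the next request.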

But as always, all comments and answers are much appreciated!

Why don't you call updateSimulation in the success callback (the second parameter of $.getJSON)?

That way the next request is only made after the previous one has finished.
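This pattern can be sketched as below. In the browser the request function would be jQuery's $.getJSON; here a synchronous stub with the same shape stands in, and a counter stops the loop after three rounds so the sketch terminates. The URL '/simulation/' comes from the question.

```javascript
// Stand-in for $.getJSON (a real request is asynchronous).
function getJSON(url, success) {
  success({ value: Date.now() });
}

var results = [];

function updateSimulation() {
  getJSON('/simulation/', function (data) {
    results.push(data.value);   // parse/use the data first...
    if (results.length < 3) {
      updateSimulation();       // ...then immediately send the next request
    }
  });
}

updateSimulation();             // start the polling loop
```

Because the next request is only issued from inside the callback, at most one request is ever in flight, so responses cannot arrive out of order.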

Firstly, when using setInterval() or setTimeout(), don't pass the method in as a string, as this invokes the infamous eval(), which is poor coding practice. Rather, pass the function itself: setInterval(updateSimulation, 50);

To make sure the ajax request has completed before another one is sent off, use setTimeout() to call the updateSimulation() method only once, and then in jQuery's ajax success callback, call updateSimulation() again.

Parse and use data first, then issue a deferred call

Use window.setTimeout() instead and reissue another request after you have finished processing the current one. This way there will be no overlap.

function updateSimulation() {
   $.getJSON('/simulation/', function(data) {
      // ..parse data here
      window.setTimeout(updateSimulation, 250);
   });
}

Adjust timing as you wish...

Use setTimeout instead, and call it when the previous call returns. The next request will then run 50 ms after the return (by the way, 50 ms is far too short for AJAX; think about 500 ms or 1000 ms).

You can also add a timestamp to the sent data and compare it against the last known timestamp:

var last_data_sent = 0;   // last timestamp seen so far

function updateSimulation() {
   $.getJSON('/simulation/', function(data) {
      window.setTimeout(updateSimulation, 500);

      if (data.sent > last_data_sent) {
         // use data
         last_data_sent = data.sent;
      }
   });
}

You could also use setInterval with the period you want (50 ms, for example) and add an indicator that records whether the previous request has finished. This indicator doesn't have to be in the global scope, though.

var requesting = false;

function updateSimulation() {
   if (!requesting) {
      requesting = true;
      $.getJSON('/simulation/', function(data) {
         // ..parse data here
         requesting = false;
      });
   }
}

setInterval(updateSimulation, 50);
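One caveat with this flag approach: if a request fails, a success-only callback never resets the flag and polling stops. Moving the reset into jQuery's .always() handler avoids that. The sketch below uses a stub with the same shape as $.getJSON so the flag logic can be traced; the ticks are driven by hand instead of setInterval so the sketch terminates.

```javascript
// Stand-in for $.getJSON: records pending handlers and returns an object
// exposing .always(), like a jqXHR.
var pendingHandlers = [];
function getJSON(url, success) {
  var handlers = { success: success, always: null };
  pendingHandlers.push(handlers);
  return { always: function (fn) { handlers.always = fn; } };
}

var requesting = false;
var ticks = 0;

function updateSimulation() {
  ticks += 1;
  if (requesting) { return; }       // skip this tick, a request is in flight
  requesting = true;
  getJSON('/simulation/', function (data) {
    // ..parse data here
  }).always(function () {
    requesting = false;             // reset on success AND on failure
  });
}

updateSimulation();                 // tick 1: sends a request
updateSimulation();                 // tick 2: skipped, still in flight
pendingHandlers[0].always();        // request finishes (or fails)
updateSimulation();                 // tick 3: sends again
```

With real jQuery the only change from the answer above is chaining .always() onto the $.getJSON call instead of resetting the flag in the success callback.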


Option 2) sounds good, but 50 ms is a little quick; I don't think the server will like it.
