
For-Loop XMLHttpRequest logs values out of order

I need this code to find all URLs from several webpages and list them in a certain order in the console. The list needs to begin with the URLs from fakeURL.com/0 and end with those from fakeURL.com/20, staying in order all the way through. The problem is that sometimes it lists URLs from (for example) fakeURL.com/5 before URLs from fakeURL.com/2.

The list also needs to be in order within each webpage: URLs nearer the top of a page should come first.

What's causing the list to be out of order, and how can I fix it?

var i;
function ajaxCall(x)
{
    var xhrs = new XMLHttpRequest();
    xhrs.open("get", 'http://fakeURL.com/' + x, true);
    xhrs.onload = function()
    {
        var doc = xhrs.response;
        $(doc).find('a').each(function()
        {
            var url = $(this).attr('href');
            console.log(url);
        });
    };
    xhrs.responseType = 'document';
    xhrs.send();
}

for (i = 0; i <= 20; i++)
{
    ajaxCall(i);
}

XMLHttpRequest is asynchronous by default. So even if you call ajaxCall() with 0, 1, 2, ... 20 in order (as in your case), there is no guarantee that the URLs are printed (via console.log) in the same sequence; each onload handler fires whenever its response happens to arrive.

For more information, read the XMLHttpRequest documentation on MDN.
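A minimal sketch of one common fix, assuming the same jQuery setup as the question: wrap each request in a Promise, fire them all in parallel, and read the results through Promise.all, which resolves with the results in the order of its input array regardless of which response arrived first.

function ajaxCallPromise(x)
{
    return new Promise(function (resolve, reject)
    {
        var xhr = new XMLHttpRequest();
        xhr.open("get", 'http://fakeURL.com/' + x, true);
        xhr.responseType = 'document';
        xhr.onload = function () { resolve(xhr.response); };
        xhr.onerror = function () { reject(new Error('Request ' + x + ' failed')); };
        xhr.send();
    });
}

var requests = [];
for (var i = 0; i <= 20; i++)
{
    requests.push(ajaxCallPromise(i));
}

Promise.all(requests).then(function (docs)
{
    // docs[0] is the document from fakeURL.com/0, docs[1] from
    // fakeURL.com/1, and so on, no matter which response came back first.
    docs.forEach(function (doc)
    {
        $(doc).find('a').each(function ()
        {
            console.log($(this).attr('href'));
        });
    });
});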

The reason you get the values out of order, despite issuing the requests incrementally in a for loop, is that an XMLHttpRequest is asynchronous by default.

As the official documentation explains, with an asynchronous HTTP request the page doesn't freeze while the request runs in the background; once the resource is fetched, you access it through a callback function.

Synchronous requests, by contrast, block the execution of code, which causes "freezing" on the screen and an unresponsive user experience.

So synchronous requests might seem the way to go in your case, but they have performance implications and can make for a bad user experience.
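For illustration only, here is what the synchronous variant would look like: passing false as the third argument to open() preserves order, but it blocks the main thread for the full duration of every request (and synchronous XHR on the main thread is deprecated). Note that responseType = 'document' is not allowed for synchronous requests, so the response has to be parsed by hand.

function ajaxCallSync(x)
{
    var xhr = new XMLHttpRequest();
    xhr.open("get", 'http://fakeURL.com/' + x, false); // false = synchronous
    xhr.send();
    // Synchronous requests cannot use responseType 'document',
    // so parse the raw text into a Document manually.
    var doc = new DOMParser().parseFromString(xhr.responseText, 'text/html');
    $(doc).find('a').each(function ()
    {
        console.log($(this).attr('href'));
    });
}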

A simple workaround for your case, which as I understand it is just listing the URLs, is to let the requests complete in whatever order they happen to. Store each URL in an array along with an attribute such as page_id that identifies which page it came from, so you can restore the order later. The outermost links could look something like:

var a = {link: "http://fakeURL.com/1", page_id: 1};

//Store such objects in a list.
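Here is a sketch of that collect-then-sort idea, assuming the same 0..20 loop as the question: every link found is pushed into one shared array together with the page it came from and its position on that page (taken here from the index argument that jQuery's .each() callback already receives), and a counter tells us when all 21 responses have arrived. printResults is a hypothetical helper, sketched further below.

var results = [];
var pending = 21; // pages 0 through 20, as in the original loop

function ajaxCollect(x)
{
    var xhr = new XMLHttpRequest();
    xhr.open("get", 'http://fakeURL.com/' + x, true);
    xhr.responseType = 'document';
    xhr.onload = function ()
    {
        $(xhr.response).find('a').each(function (idx)
        {
            results.push({ link: $(this).attr('href'), page_id: x, index: idx });
        });
        if (--pending === 0) printResults(); // all pages fetched
    };
    xhr.send();
}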

For the links within a single page as well, let them be parsed in whatever order they arrive, and just associate each one with its position using .index() in jQuery. From the documentation:

<!-- HTML -->
<ul>
 <li id="foo">foo</li>
 <li id="bar">bar</li>
 <li id="baz">baz</li>
</ul>

// JavaScript
var listItem = document.getElementById( "bar" );
alert( "Index: " + $( "li" ).index( listItem ) );

// Outputs: Index: 1

Now when you need to display the links, just sort the collected list by the page_id attribute and then by the within-page index, and display them accordingly, as sketched below. Hope this gets you started in the right direction.
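Tying the sketch above together, once every response has arrived you sort by page_id first and by the within-page index second, then print:

function printResults()
{
    results.sort(function (a, b)
    {
        // Pages first, then position within the page.
        return a.page_id - b.page_id || a.index - b.index;
    });
    results.forEach(function (r)
    {
        console.log(r.link);
    });
}

for (var i = 0; i <= 20; i++)
{
    ajaxCollect(i);
}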
