
I need to parse a huge XML array with at least 300 elements, but it slows down; I need advice

I am trying to parse the XML elements of an array with this code. With a small number of elements it works fine, but when there are more than 50 elements a problem appears: it slows down very quickly and the process even gets stuck.

$urls = array('{250 domain names}');
foreach ($urls as $url) {
    // One HTTP request per domain, fetched sequentially
    $xml = simplexml_load_file('http://data.alexa.com/data?cli=10&dat=snbamz&url=' . $url);
    $rank = isset($xml->SD[1]->POPULARITY) ? $xml->SD[1]->POPULARITY->attributes()->TEXT : 0;
    $web = (string)$xml->SD[0]->attributes()->HOST;
    echo "$web ---> $rank<br>";
}

I need to put at least 250 domains in the PHP array, but the output gets stuck. Is there some other way to code this so the output appears without slowdowns? Thanks in advance.

As Barmar noted, you're making an excessive number of HTTP requests back to back. Consider batching the curl requests; see the sketch after the link below.

https://www.php.net/manual/en/function.curl-multi-init.php
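Here is a minimal sketch of what batching with curl_multi could look like, reusing the URL and the SD[0]/SD[1] parsing from your code. The batch size of 25 and the 10-second timeout are arbitrary example values, not recommendations.

<?php
$domains = array('{250 domain names}');

foreach (array_chunk($domains, 25) as $batch) {
    $mh = curl_multi_init();
    $handles = array();

    // Queue one request per domain in this batch.
    foreach ($batch as $domain) {
        $ch = curl_init('http://data.alexa.com/data?cli=10&dat=snbamz&url=' . urlencode($domain));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$domain] = $ch;
    }

    // Run all requests in the batch concurrently.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh);
        }
    } while ($active && $status === CURLM_OK);

    // Collect and parse each response.
    foreach ($handles as $domain => $ch) {
        $body = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);

        $xml = ($body !== false && $body !== '') ? simplexml_load_string($body) : false;
        $rank = ($xml !== false && isset($xml->SD[1]->POPULARITY))
            ? $xml->SD[1]->POPULARITY->attributes()->TEXT
            : 0;
        $web = ($xml !== false && isset($xml->SD[0]))
            ? (string)$xml->SD[0]->attributes()->HOST
            : $domain;

        echo "$web ---> $rank<br>";
    }

    curl_multi_close($mh);
}

This way each batch of 25 lookups runs in parallel instead of 250 requests running one after another, which is usually where the slowdown comes from.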
