
What is the best way to read a JSON API that doesn't respond every time?

I have a PHP function that tries to read a JSON API. It can take up to 10 attempts to get a response, and I want to send that response to the view, but the function takes too long to run and often fails. How can I make this code work better?

Edit: I changed the question.

It's basically this now. It kind of works, but I would like a better alternative.

$getjson = null;

for ($i = 0; $i < 10; $i++) {
    try {
        // json_decode returns an object by default; pass true to get an array,
        // otherwise array_key_exists() below fails
        $getjson = json_decode(file_get_contents('url'), true);
        if (is_array($getjson) && array_key_exists('index', $getjson)) break;
    } catch (\Exception $e) {
        // ignore and retry
    }
}
}

return $getjson;

You say your code runs, but returns null if it cannot read anything. The code below avoids that specific scenario.

$get = null;

while ($get == null || $get === false) {
    $get = file_get_contents('url');
}

return json_decode($get);

NOTE: This code can technically become an endless loop if the fetch keeps failing. It would be very advisable to add checks so that the script is neither killed prematurely by the web server nor left running forever: set a limit on how many fetch attempts to make, run them as a bounded loop, and have a fallback in case every attempt still fails.

Something like this would be more advisable:

$get = null;
$limit = 100;
$i = 0;

do {
    $get = file_get_contents('url');
    $i++;

} while(($get == null || $get === false) && $i < $limit);

return json_decode($get);
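If the remote API is flaky, retrying in a tight loop can make things worse. Here is a sketch of the same bounded-retry idea, factored into a helper that takes the fetcher as a callable so it is easy to test; the name `fetchWithRetry` and the optional pause between attempts are my own additions, not part of the original code:

```php
<?php
// Hypothetical helper: bounded retry with an optional pause between attempts.
// $fetch is any callable returning the response body, or false on failure
// (for example: fn () => file_get_contents('url')).
function fetchWithRetry(callable $fetch, int $limit = 10, int $pauseSeconds = 0)
{
    for ($i = 0; $i < $limit; $i++) {
        $body = $fetch();
        if ($body !== false && $body !== null && $body !== '') {
            return $body; // got a usable response
        }
        if ($pauseSeconds > 0) {
            sleep($pauseSeconds); // brief pause so we don't hammer the API
        }
    }
    return false; // fallback: every attempt failed
}
```

The caller can then do `json_decode(fetchWithRetry(fn () => file_get_contents('url')), true)` and handle a `false` result explicitly instead of looping forever.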

Finally, if you don't like do-while, you can do it with a single for loop as well.

$get = null;
$limit = 100;

for ($i = 0; ($get == null || $get === false) && $i < $limit; $i++) {
    $get = file_get_contents('url');
}

return json_decode($get);
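One caveat that applies to all of the loops above: `file_get_contents` can succeed while the body still isn't valid JSON (for example, an HTML error page), and `json_decode` quietly returns null in that case. A small sketch of validating the decode result as well; the helper name `decodeJsonOrNull` is my own, not from the original answer:

```php
<?php
// Hypothetical helper: decode a response body, returning null unless the
// body is valid JSON. json_decode itself returns null on malformed input,
// so we consult json_last_error() to tell "invalid JSON" apart from the
// JSON literal null.
function decodeJsonOrNull(string $body)
{
    $decoded = json_decode($body, true); // true => associative arrays
    return json_last_error() === JSON_ERROR_NONE ? $decoded : null;
}
```

With this in place, the retry condition can check the decoded value rather than just whether the HTTP read returned anything.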
