
Timeout issue with file_get_contents?

Hey, I have a problem with my code. It works fine for the first 10 names, but after that file_get_contents() returns only empty strings.

Is this a timeout problem, or is there some other reason?

And how can I fix this?

My code:

<?php
$member = array();

// Load all player names from the database, keyed by id.
mysql_connect("localhost","**********","********");
mysql_select_db('bf3_ezstats');
$sql = mysql_query('SELECT id, name FROM ez2bf3_player ORDER BY id ASC');

while ($row = mysql_fetch_assoc($sql)) {
    $member[$row['id']] = $row['name'];
}
mysql_close();
print_r($member);

// Raise the socket timeout once, before the loop, instead of on every iteration.
ini_set('default_socket_timeout', 120);

foreach ($member as $id => $player) {
    $SC = file_get_contents('http://battlelog.battlefield.com/bf3/user/'.$player);
    if ($SC === false) {
        echo "Request failed for $player\n";
        continue;
    }

    // Cut out the game-report container; explode() replaces the deprecated split().
    $parts = explode('<surf:container id="profile-gamereport-previews">', $SC);
    if (count($parts) < 2) {
        echo "Container not found for $player\n";
        continue;
    }
    $parts = explode('</surf:container>', $parts[1]);
    $SC = $parts[0];

    // Collect every href inside the container as an absolute URL.
    $IPs = array(0 => $player);
    while (strpos($SC, 'href') !== false) {
        $start = strpos($SC, 'href');
        $end   = strpos($SC, '"', $start + 6);
        $link  = substr($SC, $start, $end - $start);
        $IPs[] = "http://battlelog.battlefield.com".str_replace('href="', "", $link);
        $SC    = substr($SC, $end);
    }
    print_r($IPs);
}
?>
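
For reference, file_get_contents() also accepts a per-request timeout through a stream context, which avoids the global ini_set() and lets you tell a failed request apart from an empty page. A minimal sketch under that assumption (the player name in the URL is a placeholder):

<?php
// Minimal sketch: per-request timeout via a stream context.
$context = stream_context_create(array(
    'http' => array(
        'timeout'    => 120,            // seconds before the request gives up
        'user_agent' => 'Mozilla/5.0',  // some sites reject requests with no User-Agent
    ),
));

$html = file_get_contents('http://battlelog.battlefield.com/bf3/user/SomePlayer', false, $context);

if ($html === false) {
    echo "Request failed\n";
} elseif ($html === '') {
    echo "Empty response; the server may be throttling repeated requests\n";
}
?>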

file_get_contents() on external URIs is fragile and a potential security issue. It can fail in many hard-to-diagnose ways, probably including the one you are seeing.

If you need to work with external servers over HTTP, I strongly recommend cURL ( http://php.net/manual/fr/book.curl.php ). You'll find it handier, I think, and it may save you a lot of trouble.
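
A minimal cURL sketch along those lines, with an explicit timeout and visible error reporting (the player name in the URL is a placeholder):

<?php
// Fetch one profile page with cURL, with timeouts and error reporting.
$ch = curl_init('http://battlelog.battlefield.com/bf3/user/SomePlayer');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);    // seconds allowed to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 120);          // overall time limit for the whole request

$html = curl_exec($ch);

if ($html === false) {
    // curl_error() names the failure instead of silently returning an empty string
    echo 'cURL error: ' . curl_error($ch) . "\n";
} else {
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    echo "HTTP $status, " . strlen($html) . " bytes\n";
}
curl_close($ch);
?>

Unlike the silent empty strings above, curl_error() and the HTTP status code make the actual failure mode visible.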
