
Catching errors when downloading massive files via PHP

I am attempting to download fairly large files (up to, and possibly over, 1 GB) from a remote HTTP server through a PHP script. I am using fgets() to read the remote file line by line and write the contents into a local file created through tempnam(). However, downloads of very large files (several hundred MB) are failing. Is there any way I can rework the script to catch the errors that are occurring?

Because the download is only part of a larger overall process, I would like to be able to handle the downloads and deal with errors in the PHP script rather than having to go to wget or some other process.

This is the script I am using now:

$tempfile = fopen($inFilename, 'w');
if ($tempfile === false) {
    throw new Exception('Unable to open local file for writing.');
}
$handle = @fopen("https://" . $server . ".domain.com/file/path.pl?keyID=" . $keyID . "&format=" . $format . "&zipped=true", "r");
$firstline = '';
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        if ($buffer === false) break; // read failed mid-stream
        if ($firstline == '') $firstline = $buffer;
        fwrite($tempfile, $buffer);
    }
    fclose($handle);
    fclose($tempfile);
    return $firstline;
} else {
    fclose($tempfile);
    throw new Exception('Unable to open remote file.');
}

I would say you are looking for stream_notification_callback (in particular the STREAM_NOTIFY_FAILURE and STREAM_NOTIFY_COMPLETED constants).
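A minimal sketch of that approach, assuming the same URL and variable names as the question (the helper name downloadWithNotifications is made up for illustration, and this has not been tested against the real server): a notification callback is attached to the HTTP stream context so that mid-transfer failures surface as catchable exceptions instead of a silently truncated file.

```php
// Sketch only: $url is the full remote URL built from $server/$keyID/$format
// as in the question; downloadWithNotifications is a hypothetical helper name.
function downloadWithNotifications(string $url, string $outFilename): string
{
    $error = null;

    // The notification callback fires for http:// and https:// wrapper events.
    $notifier = function ($code, $severity, $message, $messageCode,
                          $bytesTransferred, $bytesMax) use (&$error) {
        if ($code === STREAM_NOTIFY_FAILURE
            || $severity === STREAM_NOTIFY_SEVERITY_ERR) {
            $error = $message ?: 'stream failure';
        }
        // A STREAM_NOTIFY_COMPLETED event would indicate a clean finish here.
    };

    $context = stream_context_create([], ['notification' => $notifier]);

    $in = @fopen($url, 'r', false, $context);
    if ($in === false) {
        throw new Exception('Unable to open remote file.'
            . ($error !== null ? " ($error)" : ''));
    }

    $out = fopen($outFilename, 'w');
    if ($out === false) {
        fclose($in);
        throw new Exception('Unable to open local file for writing.');
    }

    $firstline = '';
    while (!feof($in)) {
        $buffer = fgets($in, 4096);
        if ($buffer === false) {
            break; // read error; $error holds details if the wrapper reported any
        }
        if ($firstline === '') {
            $firstline = $buffer;
        }
        fwrite($out, $buffer);
    }

    fclose($in);
    fclose($out);

    if ($error !== null) {
        throw new Exception('Download failed: ' . $error);
    }
    return $firstline;
}
```

Note the callback only receives events from URL wrappers such as http://, so a plain local fopen() will never trigger it; for non-HTTP sources you still need the fgets() === false check shown above.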

