
Apache/PHP large file download (>2Gb) failing

I'm using a PHP script to control access to downloadable files. This works fine for anything under 2 GB but fails for larger files.

  • Apache and PHP are both 64bit
  • Apache will allow the file to be downloaded if accessed directly (which I can't allow)

The guts of the PHP (ignoring the access control):

if (ob_get_level())  ob_end_clean();

error_log('FILETEST: '.$path.' : '.filesize($path));
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.basename($path));
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($path));
readfile($path);
exit;

The error log shows the correct file size:

[Tue Apr 08 11:01:16 2014] [error] [client *.*.*.*] FILETEST: /downloads/file.name : 2251373807, referer: http://myurl/files/

But the access log has a negative size:

 *.*.*.* - - [08/Apr/2014:11:01:16 +0100] "GET /files/file.name HTTP/1.1" 200 -2043593489 "http://myurl/files/" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:24.0) Gecko/20100101 Firefox/24.0"

And so browsers refuse to download the file (the logged length is exactly the real size minus 2^32, which looks like a 32-bit overflow somewhere in the stack). In fact, using wget, nothing is sent at all:

$ wget -S -O - http://myurl/files/file.name
--2014-04-08 11:33:38--  http://myurl/files/file.name
HTTP request sent, awaiting response... No data received.
Retrying.

Try reading the file in chunks and sending them to the browser, instead of filling your local memory with 2 GB and flushing it all at once.

Replace readfile($path); with:

@ob_end_flush();
flush();

// Stream the file in small chunks instead of buffering it all in memory.
$fileDescriptor = fopen($path, 'rb');

while (!feof($fileDescriptor)) {
    echo fread($fileDescriptor, 8192);
    @ob_end_flush();
    flush();
}

fclose($fileDescriptor);
exit;

8192 bytes is a critical chunk size in some cases; refer to php.net/fread for details.

Adding some microtime() checks (and comparing them with the file descriptor's pointer position) also lets you control the maximum download speed; a rough sketch follows the note below.

* (How much the output buffer actually gets flushed also depends somewhat on the web server; use those calls so PHP at least tries to flush as much as possible.)
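Here is a minimal sketch of that throttling idea (the $maxBytesPerSecond name and the 1 MiB/s value are my own assumptions, not part of the original answer): it compares the elapsed microtime() against the descriptor's ftell() position and sleeps whenever the transfer is ahead of the allowed rate.

$fileDescriptor = fopen($path, 'rb');
$maxBytesPerSecond = 1024 * 1024;   // assumed limit: 1 MiB/s
$start = microtime(true);

while (!feof($fileDescriptor)) {
    echo fread($fileDescriptor, 8192);
    @ob_end_flush();
    flush();

    // ftell() = bytes sent so far; sleep if we are ahead of the target rate.
    $aheadBy = ftell($fileDescriptor) / $maxBytesPerSecond - (microtime(true) - $start);
    if ($aheadBy > 0) {
        usleep((int) ($aheadBy * 1000000));
    }
}

fclose($fileDescriptor);
exit;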

I came across this issue before and used the script below for downloading files. It sends the file in chunks, so large files can be served without reading the whole file at once. It also handles HTTP Range requests, since some browsers (namely IE) and download managers treat these headers slightly differently.

private function outputFile($file, $name, $mime_type='') {
    $fileChunkSize = 1024*30;

    if(!is_readable($file)) die('File not found or inaccessible!');

    $size = filesize($file);
    $name = rawurldecode($name);

    $known_mime_types=array(
        "pdf" => "application/pdf",
        "txt" => "text/plain",
        "html" => "text/html",
        "htm" => "text/html",
        "exe" => "application/octet-stream",
        "zip" => "application/zip",
        "doc" => "application/msword",
        "xls" => "application/vnd.ms-excel",
        "ppt" => "application/vnd.ms-powerpoint",
        "gif" => "image/gif",
        "png" => "image/png",
        "jpeg"=> "image/jpg",
        "jpg" =>  "image/jpg",
        "php" => "text/plain"
     );

     if($mime_type=='')
     {
         $file_extension = strtolower(substr(strrchr($file,"."),1));
         if(array_key_exists($file_extension, $known_mime_types))
            $mime_type=$known_mime_types[$file_extension];
         else
            $mime_type="application/force-download";
     }

     @ob_end_clean();

     if(ini_get('zlib.output_compression'))
      ini_set('zlib.output_compression', 'Off');

     header('Content-Type: ' . $mime_type);
     header('Content-Disposition: attachment; filename="'.$name.'"');
     header("Content-Transfer-Encoding: binary");
     header('Accept-Ranges: bytes');
     header("Cache-control: private");
     header('Pragma: private');
     header("Expires: Mon, 26 Jul 1997 05:00:00 GMT");

     if(isset($_SERVER['HTTP_RANGE']))
     {
        list($a, $range) = explode("=",$_SERVER['HTTP_RANGE'],2);
        list($range) = explode(",",$range,2);
        list($range, $range_end) = explode("-", $range);
        $range=intval($range);
        if(!$range_end)
            $range_end=$size-1;
        else
            $range_end=intval($range_end);

        $new_length = $range_end-$range+1;
        header("HTTP/1.1 206 Partial Content");
        header("Content-Length: $new_length");
        header("Content-Range: bytes $range-$range_end/$size");
     } 
     else 
     {
        $new_length=$size;
        header("Content-Length: ".$size);
     }

     $chunksize = $fileChunkSize;
     $bytes_send = 0;
     if ($fp = fopen($file, 'rb'))
     {
        if(isset($_SERVER['HTTP_RANGE']))
            fseek($fp, $range);

        while(!feof($fp) &&
            (!connection_aborted()) &&
            ($bytes_send<$new_length)
        )
        {
            // Do not read past the end of the requested range.
            $buffer = fread($fp, min($chunksize, $new_length - $bytes_send));
            print($buffer);
            flush();
            $bytes_send += strlen($buffer);
        }
        fclose($fp);
     }
     else die('Error - cannot open file.');

    die();
}
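Since the method above is private, it has to be called from inside the same class. A minimal usage sketch (the download() wrapper and its argument are hypothetical, not part of the original answer):

// Hypothetical caller in the same class as outputFile().
public function download($path) {
    // Your access-control checks would go here.
    $this->outputFile($path, basename($path), 'application/octet-stream');
}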

Add this code before readfile($path):

ob_clean();
flush();

I use this code for downloads:

if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename='.basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));

    ob_clean();
    flush();
    readfile($file);
    exit;
}

Your best option is to force Apache into HTTP chunked mode with a function like the one below; you'll save a lot of PHP memory that way.

function readfile_chunked($filename, $retbytes = TRUE) {
  $CHUNK_SIZE=1024*1024;
  $buffer = '';
  $cnt =0;
  $handle = fopen($filename, 'rb');
  if ($handle === false) {
    return false;
  }
  while (!feof($handle)) {
    $buffer = fread($handle, $CHUNK_SIZE);
    echo $buffer;
    @ob_flush();
    flush();
    if ($retbytes) {
      $cnt += strlen($buffer);
    }
  }
  $status = fclose($handle);
  if ($retbytes && $status) {
    return $cnt; // return num. bytes delivered like readfile() does.
  }
  return $status;
}
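One way to wire this into the question's script (my assumption; the answer does not spell it out): send the same headers as before but leave out Content-Length, so the response is streamed and Apache falls back to Transfer-Encoding: chunked.

if (ob_get_level()) ob_end_clean();

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.basename($path));
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
// No Content-Length header here: its absence is what lets Apache switch to chunked output.

readfile_chunked($path);
exit;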
