
Bypass PHP CURL 2GB download limit

I am trying to download a 7GB file using php-curl, however it seems that it will only download the first 2GB.

There doesn't seem to be any documentation or discussion about this.

Anyone have any ideas?

Here are two useful links on the topic:

Downloading a large file using curl

How to partially download a remote file with cURL?

Basically you may have two problems here:

  • You are reading the response into memory first and, as such, exhausting PHP's memory allocation
  • You need to download the file in chunks to overcome certain restrictions in the HTTP protocol
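The first point above can be sketched like this: instead of letting curl buffer the whole body in memory (the default when you use `CURLOPT_RETURNTRANSFER`), hand it an open file handle via `CURLOPT_FILE` so the response is streamed straight to disk. The URL and destination path below are placeholders, not from the original question:

```php
<?php
// Minimal sketch: stream a large download directly to disk.
// $url and $dest are hypothetical placeholders.
$url  = 'http://example.org/big.bin';
$dest = '/tmp/big.bin';

$fp = fopen($dest, 'wb');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);            // write body straight to $fp, not into memory
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects if the server issues them
curl_setopt($ch, CURLOPT_TIMEOUT, 0);           // no overall timeout for a multi-GB transfer

curl_exec($ch);
if (curl_errno($ch)) {
    fwrite(STDERR, 'Download failed: ' . curl_error($ch) . "\n");
}
curl_close($ch);
fclose($fp);
```

With this approach peak memory stays roughly constant regardless of file size, since curl writes each received chunk to the handle as it arrives.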

There are also file system limitations and whatnot, so check your file system type as mentioned by @ajreal (e.g. FAT32 has a 4GB limit; there's a 99% chance you're not using FAT, but it is still an example).

As the OP found out, it had to do with the DB:

Turns out it was a database issue. File sizes were stored in a MySQL database in bytes, and the maximum value for a signed "int" column is 2147483647. Changing the column type to "bigint" fixed the issue.
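The fix the OP describes corresponds to something like the following, where the table and column names are hypothetical stand-ins:

```sql
-- Widen the size column so it can hold values above 2^31 - 1 bytes (~2GB).
-- "files" and "size_bytes" are hypothetical names.
ALTER TABLE files MODIFY size_bytes BIGINT UNSIGNED NOT NULL;
```

A signed `INT` tops out at 2147483647, which is exactly why the download appeared to stop at the 2GB mark; `BIGINT` raises the ceiling far beyond any practical file size.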

Assuming your file system can handle files larger than 2GB, you can try using copy:

copy("http://example.org/your_file", "/tmp/your_file");

Also make sure you set an appropriate time limit (with set_time_limit(...)).
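Putting those last two suggestions together, a minimal sketch might look like this (the URL and destination path are placeholders):

```php
<?php
// Allow the script to run as long as the transfer needs;
// the default max_execution_time would abort a multi-GB download.
set_time_limit(0);

// copy() streams the remote file to disk chunk by chunk,
// so it does not load the whole file into memory.
if (!copy('http://example.org/your_file', '/tmp/your_file')) {
    fwrite(STDERR, "Download failed\n");
}
```

Note that `copy()` with an HTTP source requires `allow_url_fopen` to be enabled in php.ini.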
