
Getting a 2GB file inside PHP?

I need to download a very large file via PHP. The last time I did it manually over HTTP it was 2.2 GB in size and took a few hours to download. I would like to automate the download somehow.

Previously I have used:

file_put_contents($filename, file_get_contents($url));

Will this be ok for such a large file? After downloading, I will want to untar the file and then analyze the various files inside the tarball.
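
For the untar step, PHP's bundled PharData class can extract a tarball without shelling out to tar. A minimal sketch, with placeholder paths:

// Extract the downloaded tarball; both paths are placeholders.
$tar = new PharData('/tmp/huge-archive.tar');
$tar->extractTo('/tmp/extracted', null, true); // third argument: overwrite existing files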

file_get_contents() is handy for small files but it's totally unsuitable for large files. Since it loads the entire file into memory, you would need about 2 GB of RAM for each script instance!

You should resort to the old fopen() + fread() approach instead.
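
A minimal sketch of that approach, streaming the download in fixed-size chunks so memory use stays constant regardless of file size (the URL and paths are placeholders):

$url = 'http://example.com/huge-archive.tar'; // placeholder URL
$filename = '/tmp/huge-archive.tar';          // placeholder destination

$in = fopen($url, 'rb');
$out = fopen($filename, 'wb');
if ($in === false || $out === false) {
    die('Unable to open source or destination');
}

// Read 8 KB at a time; only one chunk is ever held in memory.
while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}

fclose($in);
fclose($out);

PHP's built-in stream_copy_to_stream($in, $out) performs the same chunked copy in a single call, if you prefer.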

Also, don't rule out using a third-party download tool like wget (installed by default on many Linux systems) and creating a cron task to run it. That is possibly the best way to automate a daily download.
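
For instance, a hypothetical crontab entry for a daily fetch at 02:00 (URL and output path are placeholders):

# Fetch the archive quietly every day at 02:00
0 2 * * * /usr/bin/wget -q -O /data/huge-archive.tar http://example.com/huge-archive.tar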

You will have to modify php.ini to accept larger uploaded files, and raise your memory usage limit.
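
For example, hypothetical php.ini values (adjust to your needs; the upload settings only matter if the file is uploaded to PHP rather than downloaded by it):

memory_limit = 256M          ; per-script memory ceiling
max_execution_time = 0       ; 0 = no time limit (already the default on the CLI)
upload_max_filesize = 3G     ; only relevant for uploads to PHP
post_max_size = 3G           ; must be >= upload_max_filesize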


 