
Perl: How to download a large zip file in a loop and provide feedback

Using Perl, how do I download a large zip file from a web server and post status messages while it is downloading? I tried LWP::Simple 's getstore() and turned on $ua->show_progress(1) , but it blocks while downloading, so I can't provide feedback over POSIX named pipes to another application.

Since the comments on this answer specified the problem more precisely than the question itself, you might try the approach below. It forks a child to perform the download, and in the parent you can read the progress information from a file descriptor.

Nevertheless, a much better interaction between the download and the program that handles the progress might be possible, because in my opinion forking a process just so the download progress can somehow be displayed is ugly. But this part of the download feedback depends mostly on how you've designed your progress-showing application, and that design is unknown to me.

use strict;
use warnings;
use LWP::UserAgent;
use IO::Handle;   # for autoflush on the pipe handle

my $url = "http://...";
my $file = "outputfile";

pipe(my $rfh, my $wfh) or die "pipe failed: $!";
defined( my $pid = fork() ) or die "fork failed: $!";
if ($pid == 0) {
    # download in child, redirect progress to pipe
    close($rfh);
    $wfh->autoflush(1);
    open(STDERR,">&",$wfh) || die $!;
    close($wfh);

    my $ua = LWP::UserAgent->new;
    $ua->show_progress(1);
    $ua->get($url, ':content_file' => $file );
    exit;
}

# read in master from pipe
close($wfh);
$SIG{CHLD} = 'IGNORE';
while (sysread($rfh, my $buf, 8192,0)) {
    print "progress... $buf\n";
}
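The question asked for feedback over POSIX named pipes: the parent's read loop above could forward each progress chunk into a fifo that the other application reads. A minimal sketch, where the fifo path is an assumption and not something from the original answer:

```perl
#!/usr/bin/perl
# Create a named pipe (fifo) that another application can read the
# progress messages from. The path is a hypothetical example.
use strict;
use warnings;
use POSIX qw(mkfifo);

my $sFifo = "/tmp/download_status";
unless (-p $sFifo) {
    mkfifo($sFifo, 0700) or die "mkfifo failed: $!";
}

# Then, inside the while loop that reads from $rfh:
#   open(my $fh, '>', $sFifo) or die $!;   # blocks until a reader attaches
#   print $fh "progress... $buf\n";
```

Note that opening a fifo for writing blocks until the reading side attaches, so the consumer application needs to be running first (or the fifo opened non-blocking).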

#!/usr/bin/perl
use strict;
use warnings;
use LWP::Simple qw(getstore);
use POSIX qw(WNOHANG);

my $sURL = "http://example.com/example.zip";
my $sSaveFile = "/tmp/download.zip";

# It's probably a good idea to first check the server with a HEAD request
# on the URL before wasting time on a download.
# But besides that, here you go...

my $sMessage = "Downloading";
defined(my $pid = fork()) or die "fork failed: $!";
if ($pid == 0) {
  getstore($sURL,$sSaveFile);
  exit;
}
do {
    # Here we just print, but you could also provide feedback via named
    # pipes or some other mechanism. You might also want to check whether
    # the file size is still increasing; if not, increment a stall counter,
    # and after about 5 stalls give up on the download with a "last" statement.
    # Byte checks are easy: my $nBytes = (-s $sSaveFile);
    print "$sMessage\n";
    $sMessage .= '.';
    sleep(2);
} while (waitpid($pid, WNOHANG)==0);
print "\n\nDONE\n\n";
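The stall check described in the comments can be sketched as a small helper. The threshold of 5 and the use of -s mirror the comments; the function name is illustrative:

```perl
#!/usr/bin/perl
# Track consecutive size samples that show no growth; once the counter
# reaches the threshold, the caller should abandon the download ("last").
use strict;
use warnings;

my $nLastSize = 0;
my $nStalls   = 0;

sub check_stalled {
    my ($nBytes) = @_;         # current size, e.g. (-s $sSaveFile) // 0
    if ($nBytes > $nLastSize) {
        $nLastSize = $nBytes;  # file grew, reset the stall counter
        $nStalls   = 0;
    } else {
        $nStalls++;
    }
    return $nStalls >= 5;      # true => give up on the download
}
```

Calling this once per loop iteration with the current (-s) size would make the do/while loop above bail out after roughly 10 seconds without progress, given the sleep(2).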

I found I had to do it this way instead of shelling out and running curl, because OS X Lion has a bug where curl times out after 30 seconds when used in a piped process; later versions of OS X no longer have this bug.
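The HEAD-request idea from the comment above can be sketched like this. The function name precheck is illustrative, and returning the Content-Length is an assumption about what you'd want from the probe:

```perl
#!/usr/bin/perl
# Probe the URL with a HEAD request before downloading: refuse if the
# server rejects it, otherwise report the advertised size (if any).
use strict;
use warnings;
use LWP::UserAgent;

sub precheck {
    my ($sURL) = @_;
    my $ua  = LWP::UserAgent->new(timeout => 10);
    my $res = $ua->head($sURL);
    return undef unless $res->is_success;         # refuse to download
    return $res->header('Content-Length') // -1;  # -1 = size unknown
}

# Usage, with the placeholder URL from the answer:
# my $nSize = precheck("http://example.com/example.zip")
#     // die "server rejected HEAD request\n";
```

Knowing the expected size up front also makes the byte check above more useful, since you can report a real percentage instead of just raw byte counts.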
