
Download multiple files in parallel with wget

I want to check a list of URLs and, if the file (.pdf) exists, download it. I use this command and it works:

for i in `cat url.txt`; do if wget -q --method=HEAD $i; then wget $i; fi ; done

But this is really slow, and I have a lot of URLs to check. Is there a way to "multithread" this command?

Use GNU parallel (a command-line tool written in Perl).

e.g.:

for i in `cat url.txt`; do if wget -q --method=HEAD $i; then 
   echo "wget '$i'"; fi ; done \
   | parallel
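Note that in the snippet above only the downloads run in parallel; the HEAD checks still happen one at a time in the loop. A further speed-up is to move the whole check-and-download step into the parallel workers. Here is a minimal sketch using `xargs -P`, which is available even where GNU parallel is not installed; the file name `url.txt` and the worker count of 8 are assumptions to adjust:

```shell
# Hypothetical url.txt: one URL per line.
# -P 8 runs up to 8 check-and-download pipelines concurrently.
xargs -P 8 -I {} sh -c '
  # HEAD request first; download only if the file exists
  wget -q --method=HEAD "$1" && wget -q "$1"
' _ {} < url.txt
```

The same idea with GNU parallel would be `cat url.txt | parallel -j 8 'wget -q --method=HEAD {} && wget -q {}'`, which also checks each URL inside its own job slot.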


 