
Bulk download files from a URL

I need to download files in bulk (each 0-2.5 MB) from a URL to my server (Linux CentOS, but it could be any other system too).

I would like to use wget (if you have another solution, please post it):

My first approach is to test it with only 1 file:

wget -U --load-cookies=cookies.txt "url"

This is the shell response:

The problem is that it doesn't download the file, only an empty HTML page. The necessary cookie is saved in the correct format in the cookies file, and the download works in the browser.
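
One guess, and only an assumption on my part: wget's -U option normally expects a user-agent string as its argument, so in the command above --load-cookies=cookies.txt might be taken as that argument instead of being honored as an option. Spelled out, the call would look something like this ("url" and the user-agent string are just placeholders):

wget --user-agent="Mozilla/5.0" --load-cookies=cookies.txt "url"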

Once downloading a single file works, I want to use a text file with all the URLs (e.g. urls.txt), where the URLs look like the one above but only one parameter changes. I would also like it to download maybe 10-100 files at a time. If you have a solution in PHP or Python for this, that would help me too.
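
To illustrate what I mean, a rough sketch only (urls.txt with one URL per line and the limit of 10 parallel downloads are just examples, nothing I have working yet):

wget --load-cookies=cookies.txt -i urls.txt                  # one after another
xargs -P 10 -n 1 wget --load-cookies=cookies.txt < urls.txt  # up to 10 at a time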

Thank you for your help!

I have solved it now with aria2. It's a great tool for such things.
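
A minimal sketch of such an aria2 call (urls.txt with one URL per line, the limit of 10 concurrent downloads and the cookie file are placeholders, not the exact options from my setup):

aria2c -i urls.txt -j 10 --load-cookies=cookies.txt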

Basically:

for i in foo bar 42 baz; do
    wget -other -options -here "http://blah/blah?param=$i" -O "$i.txt"
done

Note the -O parameter, which lets you set the output filename. foo.txt is a little easier to use than data-output?format=blahblahblah.
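
If the URLs come from a file instead of a fixed list, the same idea works with a read loop. A sketch, assuming urls.txt holds one URL per line and that the value of the changing parameter makes a usable file name:

while read -r url; do
    name=${url##*=}    # everything after the last '=' in the URL
    wget --load-cookies=cookies.txt "$url" -O "$name.txt"
done < urls.txt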
