
Download a list of images from URLs

I need to find (preferably) or build an app to download a lot of images. Each image has a distinct URL, and there are many thousands, so doing it manually is a huge effort. The list is currently in a CSV file (essentially a list of products, each with identifying info such as name, brand, and barcode, plus a link to a product image). I'd like to loop through the list and download each image file, ideally renaming each one to something like barcode.jpg. I've looked at a number of image scrapers, but haven't found one that works quite this way. Very appreciative of any leads to the right tool, or ideas.
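One way to sketch the loop described above is a short Python script: read the CSV, and for each row download the image URL to a file named after the barcode. The column names "barcode" and "image_url" and the file name "products.csv" are assumptions for illustration; adjust them to match the real file.

```python
import csv
import os
import urllib.request

def target_name(row, out_dir="images"):
    """Build the destination path <out_dir>/<barcode>.jpg for one CSV row.
    Assumes the CSV has a header row with a "barcode" column."""
    return os.path.join(out_dir, row["barcode"] + ".jpg")

def download_all(csv_path, out_dir="images"):
    """Loop through the CSV and fetch each product image.
    Assumes an "image_url" column holds the link to the image."""
    os.makedirs(out_dir, exist_ok=True)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                urllib.request.urlretrieve(row["image_url"], target_name(row, out_dir))
            except Exception as e:  # a dead link should not stop the whole run
                print(f"failed: {row['image_url']} ({e})")

# Usage (hypothetical file name):
# download_all("products.csv")
```

With thousands of URLs it may be worth adding a short delay between requests, or retrying failures, but the loop above is the core of it.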

Are you on Windows or Mac/Linux? On Windows you can use a PowerShell script for this; on Mac/Linux, a shell script of about 1-5 lines will do it.

Here's one way to do this:

# show what's inside the file
cat urlsofproducts.csv

http://bit.ly/noexist/obj101.jpg, screwdriver, blackndecker
http://bit.ly/noexist/obj102.jpg, screwdriver, acme

# this one-liner will GENERATE one download-command per item, but will not execute them
perl -MFile::Basename -F", " -anlE "say qq(wget -q \$F[0] -O '\$F[1]--\$F[2]--).  basename(\$F[0]) .q(')" urlsofproducts.csv 



# Output:
wget -q http://bit.ly/noexist/obj101.jpg -O 'screwdriver--blackndecker--obj101.jpg'
wget -q http://bit.ly/noexist/obj102.jpg -O 'screwdriver--acme--obj102.jpg'

Now feed the generated wget commands back into the shell to actually run them, e.g. by appending " | sh" to the one-liner above.

If possible, use Google Sheets to run a function for this kind of work. I was also puzzled by this problem, and found a way by which the images are not only downloaded but also renamed in real time.

Kindly reply if you want the code.
