
Download and save MANY images from Google Image search to a LOCAL FOLDER (Python)

The following code can download one image at a time from a website. However, what I really want is to download MANY images from a website based on a query and then save them into a LOCAL FOLDER on my computer.

I am a complete beginner at programming and Python. How can I achieve this?

import urllib.request

file = "Facts.jpg"  # file to be written to
url = "http://www.compassion.com/Images/Hunger-Facts.jpg"

response = urllib.request.urlopen(url)
fh = open(file, "wb")      # open the file for writing
fh.write(response.read())  # read from the response and write to the file
fh.close()                 # close the file so the data is flushed to disk

You could define a function and use it to repeat the task for each image URL that you want to write to disk:

def image_request(url, file):
    response = urllib.request.urlopen(url)
    with open(file, "wb") as fh:   # open the file for writing
        fh.write(response.read())  # write the downloaded bytes to disk

For example, if you had a list of URLs, you could loop over it:

for i, url in enumerate(urllist):
    image_request(url, str(i) + ".jpg")
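To save the images into a local folder rather than the current directory, you can create the folder first and join its name with each file name. Below is a minimal sketch along those lines; the folder name downloaded_images and the contents of urllist are placeholders that you would replace with your own values:

import os
import urllib.request

def image_request(url, file):
    response = urllib.request.urlopen(url)
    with open(file, "wb") as fh:   # open the file for writing
        fh.write(response.read())  # write the downloaded bytes to disk

folder = "downloaded_images"        # placeholder folder name; change as needed
os.makedirs(folder, exist_ok=True)  # create the folder if it does not already exist

# placeholder list; fill it with the image URLs you want to download
urllist = ["http://www.compassion.com/Images/Hunger-Facts.jpg"]

for i, url in enumerate(urllist):
    # os.path.join builds a path like downloaded_images/0.jpg on any OS
    image_request(url, os.path.join(folder, str(i) + ".jpg"))

Note that collecting the URLs for a Google Image search query is a separate step; this sketch only covers writing an existing list of URLs to a local folder.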
