
Downloading big file to google cloud storage using python

I'm writing a Python GAE program. What I want to do is download a file over HTTP and write it to Google Cloud Storage.

But if the file I'm requesting is big (larger than 1 MB),

url = urllib2.urlopen(link)

won't work.

It always raises a ResponseTooLargeError.

This post explains why the error occurs:

https://groups.google.com/forum/?fromgroups=#!topic/google-appengine/QEm-19vdcU4

However, it doesn't solve my problem, which is to download a big file into Cloud Storage.

Can anyone help me? Thanks!

The limit for each URL Fetch response is 32 MB; see the quotas and limits section of the Python GAE documentation. With App Engine, you won't be able to copy anything larger than that from a URL in a single request. You could upload something larger directly from your local file store into GCS or the Blobstore, but not by copying it from a URL.
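One workaround consistent with the per-request limit (a sketch, not from the answer above) is to fetch the file in chunks using HTTP Range requests, keeping each response under 32 MB, and stream the chunks into the GCS object. The helper below only computes the `Range` header values and is plain Python; the GAE-specific calls (`urlfetch.fetch`, `cloudstorage.open`) are shown only in comments and assume the server supports Range requests and that the old Python 2.7 GAE runtime is in use.

```python
# Sketch: split a download into HTTP Range requests that each stay
# under App Engine's 32 MB URL Fetch response limit.

URLFETCH_LIMIT = 32 * 1024 * 1024  # 32 MB response cap per fetch

def range_headers(total_size, chunk_size=URLFETCH_LIMIT):
    """Yield 'Range' header values covering total_size bytes."""
    start = 0
    while start < total_size:
        # Byte ranges are inclusive on both ends.
        end = min(start + chunk_size, total_size) - 1
        yield 'bytes=%d-%d' % (start, end)
        start = end + 1

# On App Engine each chunk would then be fetched and written in order
# through one open GCS writer (hypothetical usage, not tested here):
#
#   import cloudstorage
#   from google.appengine.api import urlfetch
#
#   with cloudstorage.open('/my-bucket/big-file', 'w') as gcs_file:
#       for header in range_headers(total_size):
#           result = urlfetch.fetch(link, headers={'Range': header})
#           gcs_file.write(result.content)
```

The total size can be obtained beforehand from the `Content-Length` header of a HEAD request, assuming the server reports it.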
