
django python cumulus - How to deal with uploading a large number of files to cloud file storage

I have a number of files processed and saved in a temp folder on my server, and I now want to move them into my default_storage location (default_storage is set to Rackspace Cloud Files using django-cumulus).

The process begins uploading the files correctly but only manages fewer than half of them before stopping. My guess is that it's a memory issue, but I am not sure how to go about solving it. Here is the relevant code:

import os

from django.core.files.base import ContentFile
from django.core.files.storage import default_storage

listing = os.listdir(path + '/images')
listing.sort()

for infile in listing:
    # Read each file fully into memory, then hand it to the storage backend.
    with open(path + '/images/' + infile, 'rb') as image:
        image_loc = default_storage.save(infile, ContentFile(image.read()))
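One thing worth noting about the code above: `ContentFile(image.read())` buffers each entire file in memory before uploading, so memory use grows with file size. Django's `File` wrapper (`django.core.files.File`) can be passed to `default_storage.save()` instead, and storage backends read it via `.chunks()` rather than in one go. A minimal sketch of the underlying chunked-copy idea in plain Python (the function name and chunk size are illustrative, not from the original post):

```python
import os
import tempfile


def copy_in_chunks(src_path, dst_path, chunk_size=64 * 1024):
    """Copy a file without ever holding more than one chunk in memory."""
    with open(src_path, 'rb') as src, open(dst_path, 'wb') as dst:
        # iter() calls src.read(chunk_size) until it returns b'' (EOF).
        for chunk in iter(lambda: src.read(chunk_size), b''):
            dst.write(chunk)

# The Django equivalent would look something like (hypothetical sketch):
#
#     from django.core.files import File
#     with open(path + '/images/' + infile, 'rb') as f:
#         default_storage.save(infile, File(f))  # streamed via File.chunks()
```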

Just in case it makes a difference, my server setup is a Rackspace Cloud server running nginx and gunicorn on Ubuntu.

You could give django-storages a try. It is a collection of custom storage backends that are easy to integrate, and it also supports Rackspace.

In the end the answer came in several parts. First, I had to add a TIMEOUT setting to the cumulus configuration (which is not mentioned in the django-cumulus documentation). Second, I increased the timeout for gunicorn. Finally, I increased the timeout parameters of nginx.
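Putting the three timeout changes together, the configuration might look roughly like this. This is a sketch based on the answer above: the exact key names in the `CUMULUS` dict depend on your django-cumulus version, and the 300-second value is an arbitrary example, not from the original post.

```python
# settings.py -- django-cumulus configuration (key names are assumptions,
# check your installed django-cumulus version).
CUMULUS = {
    'USERNAME': '...',   # placeholder credentials
    'API_KEY': '...',
    'CONTAINER': '...',
    'TIMEOUT': 300,      # the setting not covered in the docs, in seconds
}

# gunicorn: raise the worker timeout so long uploads aren't killed, e.g.
#   gunicorn myproject.wsgi --timeout 300
#
# nginx: raise the proxy timeouts in the relevant server/location block, e.g.
#   proxy_connect_timeout 300s;
#   proxy_send_timeout    300s;
#   proxy_read_timeout    300s;
```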
