
Uploading large files to AWS S3 with django-storages

I am new to Django, and I have a project that needs to upload large files (5 GB and larger).

I am using:

  • django-storages
  • Amazon S3
  • Django 3.0.2
  • Python 3.7.6
  • jQuery 3.1

The documentation says that when a file is larger than 2.5 MB it is handled by TemporaryFileUploadHandler, which means it is written to the /tmp directory first; once the upload is complete, the file is moved to MEDIA_ROOT (which in my case is Amazon S3).
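For reference, the threshold and handler chain described above are configurable in settings.py. A minimal sketch using Django's documented defaults (the values shown are illustrative, not taken from this project):

```python
# settings.py — sketch of Django's file-upload defaults (for illustration only)

# Uploads larger than this (2.5 MB) are streamed to a temporary file on disk
# instead of being held in memory.
FILE_UPLOAD_MAX_MEMORY_SIZE = 2621440

# None means Django uses the operating system's default temp directory,
# e.g. /tmp on Linux.
FILE_UPLOAD_TEMP_DIR = None

# The handler chain: small uploads stay in memory, larger ones fall through
# to TemporaryFileUploadHandler.
FILE_UPLOAD_HANDLERS = [
    "django.core.files.uploadhandler.MemoryFileUploadHandler",
    "django.core.files.uploadhandler.TemporaryFileUploadHandler",
]
```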

Now, the requirement is to stream the upload to Amazon S3. For example, a 20 GB file should be uploaded in a stream (little by little) to Amazon S3. How can I upload the file directly to Amazon S3 without first storing it in the /tmp directory?

Please shed some light on this.

Streaming the file from the web client through Django and on to S3 is not possible. The best solution is to split the file into chunks with client-side JS and then upload them one by one. Sample code is here.
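The chunk-size math the client needs is the same in any language (in browser JS you would slice with `Blob.slice`). A minimal Python sketch of the slicing logic, using a hypothetical helper name; note that S3's multipart-upload API requires every part except the last to be at least 5 MiB:

```python
import io

# S3 multipart uploads require every part except the last to be >= 5 MiB.
MIN_PART_SIZE = 5 * 1024 * 1024

def iter_chunks(fileobj, part_size=MIN_PART_SIZE):
    """Yield successive chunks of `fileobj` sized for S3 multipart parts.

    Hypothetical helper for illustration: the same slicing applies whether
    the chunking happens in the browser or on a server.
    """
    while True:
        chunk = fileobj.read(part_size)
        if not chunk:
            break
        yield chunk

# Usage: a 12 MiB file splits into two 5 MiB parts plus a 2 MiB tail.
data = io.BytesIO(b"x" * (12 * 1024 * 1024))
parts = list(iter_chunks(data))
print([len(p) for p in parts])  # → [5242880, 5242880, 2097152]
```

Each yielded chunk would then be sent as one part (or one small request) to the server, which keeps memory use bounded regardless of total file size.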

This solution will still save the smaller chunk files to disk, though.
