
Uploading large files to an AWS S3 bucket with Django on Heroku without the 30s request timeout

I have a Django app that allows users to upload videos. It's hosted on Heroku, and the uploaded files are stored in an S3 bucket. I am using JavaScript to upload the files directly to S3 after obtaining a presigned request from the Django app, because of Heroku's 30-second request timeout. Is there any way I can upload large files through the Django backend without using JavaScript and without compromising the user experience?

You should consider the points below when thinking about a solution.

  • Why your files should not go through your Django server on the way to S3: sending files to the Django server and then on to S3 wastes both compute and bandwidth. And why route files through the Django server at all when you can send them directly to S3 storage?
  • How you can upload files to S3 without compromising UX: sending files through the Django server is not an option, so you have to handle the upload on the frontend. But the frontend has its own limitations, such as memory: everything gets loaded into RAM, so the browser will eventually run out of memory on a very large file. I would suggest using something like dropzone.js. It won't solve the memory problem, but it can provide a good user experience, such as progress bars, file counts, and so on.

The points in the other answer are valid. The short answer to "Is there any way I can upload large files through the Django backend without using JavaScript?" is "not without switching away from Heroku".

Keep in mind that any data transmitted to your dynos goes through Heroku's routing mesh, which is what enforces the 30-second request limit to conserve its own finite resources. Long-running transactions of any kind use up bandwidth, compute, and other resources that could be serving other requests, so Heroku applies the limit to keep things moving across its thousands of dynos. When uploading a file, you will first be constrained by the client's bandwidth to your server. Then you will be constrained by the bandwidth between your dynos and S3, on top of whatever processing your dyno actually does.

The larger the file, the more likely it is that transmitting the data will exceed the 30-second timeout, particularly in step 1 for clients on unreliable networks. Creating a direct path from the client to S3 is a reasonable compromise.

