
How can I upload files larger than 5GB to Amazon S3?

I'm currently using Rails 3.2 with the Carrierwave gem to upload files to Amazon S3. Now I need to be able to handle user-submitted files larger than 5GB, while still using the Carrierwave gem. Are there any other gems, or branches of Carrierwave or Fog, that can handle 5GB+ file uploads to S3?

Edit: I'd prefer not to have to rewrite a complete Rails uploading solution, so links like this won't help: https://gist.github.com/908875 .

You want to use S3's multipart upload functionality. Helpfully, Fog can indeed handle multipart S3 uploads, as you can see in this pull request.

Unfortunately, Carrierwave does not seem to have this functionality built in. So you'd need to either modify Carrierwave or drop down to Fog manually to upload the file correctly.
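Dropping down to Fog directly might look roughly like the sketch below. This is a hypothetical example, not code from Carrierwave: the bucket name, key, and file path are placeholders, and it assumes the fog gem with AWS credentials in your environment. Passing :multipart_chunk_size is what makes fog use S3's multipart upload API rather than a single PUT (which is capped at 5GB):

```ruby
require 'fog'

# 100 MB per part; S3 requires parts of at least 5 MB (except the last).
CHUNK_SIZE = 100 * 1024 * 1024

storage = Fog::Storage.new(
  :provider              => 'AWS',
  :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
  :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
)

directory = storage.directories.get('my-bucket') # hypothetical bucket

# With :multipart_chunk_size set, fog performs a multipart upload
# instead of a single PUT, so bodies larger than 5GB are accepted.
directory.files.create(
  :key                  => 'backups/huge-file.iso',
  :body                 => File.open('/tmp/huge-file.iso', 'rb'),
  :multipart_chunk_size => CHUNK_SIZE
)
```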

I figured out how to do this and have it working now. In the proper config/environment file, add the following to send files to Amazon S3 in 100MB chunks:

CarrierWave.configure do |config|
  # 104857600 bytes = 100 * 1024 * 1024 = 100 MB per multipart chunk
  config.fog_attributes = { :multipart_chunk_size => 104857600 }
end

Since the fog gem has multipart uploads built in (thanks to Veraticus for pointing it out), the appropriate configuration attributes just need to be passed into fog via Carrierwave. When sending to S3 I received frequent Connection reset by peer (Errno::ECONNRESET) errors, so parts of the upload may have to be retried.
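One way to cope with those transient resets is a small retry wrapper around the upload call. This is an illustrative sketch (the helper name and retry count are my own, not part of Carrierwave or fog):

```ruby
# Illustrative retry wrapper for transient network errors such as
# Errno::ECONNRESET: re-runs the whole block up to max_attempts times,
# then re-raises if it still fails.
def with_retries(max_attempts = 3)
  attempts = 0
  begin
    attempts += 1
    yield
  rescue Errno::ECONNRESET
    retry if attempts < max_attempts
    raise
  end
end

# Usage with an existing Carrierwave uploader:
#   with_retries(5) { uploader.store!(file) }
```

Note this retries the entire store! call; retrying individual multipart parts would require dropping down to fog itself.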

You will need to break your file into small pieces prior to uploading.

Take a look at the following:

http://www.ruby-forum.com/topic/1282369

http://joemiller.me/2011/02/18/client-support-for-amazon-s3-multipart-uploads-files-5gb/

Either way, you need to split the file.
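If you do split the file client-side, a minimal sketch might look like the following (the helper name and the .partN naming scheme are my own, not from any library):

```ruby
# Sketch: split a file into fixed-size parts so each part can be
# uploaded individually. Part files are written next to the original
# as <path>.part0, <path>.part1, ... and their paths are returned.
def split_file(path, chunk_size)
  parts = []
  File.open(path, 'rb') do |f|
    index = 0
    while (chunk = f.read(chunk_size))
      part_path = "#{path}.part#{index}"
      File.binwrite(part_path, chunk)
      parts << part_path
      index += 1
    end
  end
  parts
end

# For 5GB+ uploads you would call this with a 100 MB chunk_size, e.g.:
#   split_file('/tmp/huge-file.iso', 100 * 1024 * 1024)
```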

