
How can I upload files larger than 5GB to Amazon S3?

I'm currently using Rails 3.2 with the Carrierwave gem to upload files to Amazon S3. Now I need to be able to handle user-submitted files larger than 5GB, while still using the Carrierwave gem. Are there any other gems or branches of Carrierwave or Fog that can handle the 5GB+ file uploads to S3?

Edit: I'd prefer not to have to rewrite a complete Rails uploading solution, so links like this won't help: https://gist.github.com/908875 .

You want to use S3's multipart upload functionality. Helpfully, Fog can indeed handle multipart S3 uploads, as you can see in this pull request.

Unfortunately, Carrierwave does not seem to have this functionality built in, so you'd need to either modify Carrierwave or drop into Fog manually to upload the file correctly.
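
If you do drop into Fog directly, a rough sketch might look like the following. This assumes your fog version includes the multipart support from that pull request (the :multipart_chunk_size attribute); the credentials, bucket name, and file paths are placeholders:

require 'fog'

# Placeholder credentials and bucket name -- adjust for your account.
storage = Fog::Storage.new(
  :provider              => 'AWS',
  :aws_access_key_id     => 'YOUR_KEY',
  :aws_secret_access_key => 'YOUR_SECRET'
)

directory = storage.directories.get('your-bucket')

# Passing :multipart_chunk_size makes fog use S3's multipart upload API,
# sending the body in 100MB parts instead of a single >5GB PUT.
directory.files.create(
  :key                  => 'huge-file.mov',
  :body                 => File.open('/path/to/huge-file.mov'),
  :multipart_chunk_size => 100 * 1024 * 1024
)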

I figured out how to do this and have it working now. In the proper config/environment file, add the following to send files in 100MB chunks to Amazon S3:

CarrierWave.configure do |config|
  # 104857600 bytes = 100MB per part; fog splits the upload into parts of this size
  config.fog_attributes = { :multipart_chunk_size => 104857600 }
end

Since the fog gem has multipart uploads built in (thanks to Veraticus for pointing it out), the appropriate configuration attributes just need to be passed into fog via Carrierwave. When sending to S3 I received frequent Connection reset by peer (Errno::ECONNRESET) errors, so parts of the upload may have to be retried.
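As an illustration only (the method name here is made up, not part of Carrierwave), a crude way to handle those resets is to rescue the error and retry the store a few times:

# Hypothetical helper: retry CarrierWave's store! when the connection is
# reset mid-upload. Adjust the number of attempts to taste.
def store_with_retries(uploader, file, attempts = 3)
  uploader.store!(file)
rescue Errno::ECONNRESET => e
  attempts -= 1
  retry if attempts > 0
  raise e
end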

You will need to break your file into small pieces prior to uploading.

Take a look at the following:

http://www.ruby-forum.com/topic/1282369

http://joemiller.me/2011/02/18/client-support-for-amazon-s3-multipart-uploads-files-5gb/

Either way, you need to split the file.
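
If you would rather drive the multipart API yourself instead of relying on fog's automatic chunking, the low-level fog requests look roughly like this. The bucket, key, and path are placeholders, and note that S3 requires every part except the last to be at least 5MB:

require 'fog'

storage = Fog::Storage.new(
  :provider              => 'AWS',
  :aws_access_key_id     => 'YOUR_KEY',
  :aws_secret_access_key => 'YOUR_SECRET'
)

bucket = 'your-bucket'
key    = 'huge-file.mov'
chunk  = 100 * 1024 * 1024  # 100MB parts

# 1. Tell S3 a multipart upload is starting and keep the upload id.
upload_id = storage.initiate_multipart_upload(bucket, key).body['UploadId']

# 2. Send the file one chunk at a time, remembering each part's ETag.
etags = []
File.open('/path/to/huge-file.mov', 'rb') do |f|
  part_number = 1
  while (data = f.read(chunk))
    response = storage.upload_part(bucket, key, upload_id, part_number, data)
    etags << response.headers['ETag']
    part_number += 1
  end
end

# 3. Tell S3 the upload is finished so it can assemble the parts.
storage.complete_multipart_upload(bucket, key, upload_id, etags)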
