
AWS_S3 ruby gem Timeout Error execution expired

I am processing some video with ffmpeg and then uploading the video to S3 with the aws_s3 gem. I use the following code:

S3Object.store("testme.mp4", open(file), 'blah', :access => :public_read)  

Everything works great but with files of 1GB and over I receive the following error:

"Timeout::Error: execution expired".  

This only happens after ffmpeg has processed the file, however. If I upload the file on its own, without processing, everything is fine.

Has anyone come across a similar issue?

Thanks,

SLothistype

I have run into this problem, and unfortunately had to monkey patch the AWS::S3::Connection::create_connection method so that I could increase the read_timeout.

If you implement the method yourself, you would set

http.read_timeout = 300 # or something else higher

I originally found this fix from Pivotal Labs. They are well respected, and their take was essentially "this is not a great solution, but the aws_s3 gem doesn't expose anything better."
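Under the hood the aws_s3 gem talks to S3 through Ruby's standard Net::HTTP, so the monkey patch boils down to raising read_timeout on the connection the gem creates. The knob itself is plain stdlib; a minimal sketch (the hostname is just an example, and no request is actually made here):

```ruby
require 'net/http'

# Net::HTTP's read_timeout defaults to 60 seconds, which a 1 GB upload
# can easily exceed. Raising it is what the monkey patch accomplishes.
http = Net::HTTP.new('s3.amazonaws.com', 443)
http.use_ssl = true
http.read_timeout = 300 # wait up to 5 minutes for a response chunk
```

Whatever value you pick, set it on the same Net::HTTP object that create_connection returns, or the gem will keep using the 60-second default.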

In the modern aws-sdk-s3 (v3) gem, the equivalent is to pass :http_read_timeout when initializing Aws::S3::Client.

# https://docs.aws.amazon.com/sdk-for-ruby/v3/api/Aws/S3/Client.html#initialize-instance_method
s3_client = Aws::S3::Client.new(
  region: region_name,
  credentials: credentials,
  http_read_timeout: 300,
)

# Instead of open(file) in the OP's code
File.open('/source/file/path', 'rb') do |file|
  # https://docs.aws.amazon.com/sdk-for-ruby/v3/api/Aws/S3/Client.html#put_object-instance_method
  s3_client.put_object(bucket: 'bucket-name', key: 'testme.mp4', body: file)
end
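For files in the gigabyte range it is also worth looking at Aws::S3::Object#upload_file in the v3 SDK, which switches to a managed multipart upload above a size threshold (around 100 MB by default), so individual parts can be retried instead of one long-lived request hitting the read timeout. As a rough illustration of the part arithmetic (the helper below is hypothetical, not an SDK API; 5 MB is S3's minimum part size):

```ruby
# Hypothetical helper (not part of the SDK): how many parts a multipart
# upload needs at S3's 5 MB minimum part size.
MIN_PART_SIZE = 5 * 1024 * 1024

def part_count(bytes, part_size = MIN_PART_SIZE)
  (bytes.to_f / part_size).ceil
end

part_count(1024**3) # a 1 GB file splits into 205 parts of 5 MB
```

Because each 5 MB part is its own short request, a transient stall costs one part retry rather than the whole 1 GB transfer.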
