
How to stream and decrypt large files in S3 with GPG using Ruby

I'm working with very large files in S3 that need to be decrypted and saved back to S3. Our disk space and memory are limited, so streaming is our best option.

I'm using the ruby-gpgme gem.

Here's what I have so far:

require 'gpgme'
require 'aws-sdk'

s3_credentials = Aws::Credentials.new(AWS_ACCESS_KEY_ID, AWS_SECRET_KEY_ID)
s3_client = Aws::S3::Client.new(credentials: s3_credentials)

source_file = 'in_file.gpg'
object_key = "upload/#{source_file}"
destination = File.open('out.txt', 'w+b')

crypto = GPGME::Crypto.new

# bucket and password are defined elsewhere in the real script.
# get_object yields the object in chunks; each chunk is handed to
# decrypt as if it were a complete ciphertext on its own.
s3_client.get_object(bucket: bucket, key: object_key) do |chunk|
  crypto.decrypt(chunk, password: password, output: destination)
end

# As a follow-up, I will set up the script to stream directly back out to S3:
# S3.get_object > GPG.decrypt > S3.upload_part

This successfully decrypts and writes the first chunk, but decryption fails before processing the next chunk.

I'm assuming that's because each chunk isn't a complete, properly terminated GPG message, and .decrypt isn't continuously reading from a stream itself.

Any ideas on how to pass the chunks as a stream to decrypt?
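One direction I've been considering is to bridge the chunks into a real IO with IO.pipe, so decrypt sees a single continuous ciphertext stream (ruby-gpgme documents that inputs can be String, IO, or GPGME::Data). A rough, untested sketch; the thread-and-pipe plumbing is my own assumption, and s3_client, bucket, password, and destination are as above:

reader, writer = IO.pipe

# Feed the S3 chunks into the pipe from a separate thread
feeder = Thread.new do
  begin
    s3_client.get_object(bucket: bucket, key: object_key) do |chunk|
      writer.write(chunk)
    end
  ensure
    writer.close # EOF tells decrypt the ciphertext is complete
  end
end

# decrypt now reads the pipe as one continuous stream
crypto.decrypt(reader, password: password, output: destination)
feeder.join
reader.close
destination.close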

Here is another approach you could look at:

https://speakerdeck.com/julik/streaming-large-files-with-ruby
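For the follow-up part (S3.get_object > GPG.decrypt > S3.upload_part), newer versions of the aws-sdk-s3 gem expose Aws::S3::Object#upload_stream, which handles the multipart-upload plumbing for you. A rough, untested sketch of the full pipeline; source_bucket, dest_bucket, object_key, and password are placeholders you'd supply:

require 'gpgme'
require 'aws-sdk-s3'

reader, writer = IO.pipe

# Download thread: stream the ciphertext from S3 into the pipe
downloader = Thread.new do
  begin
    s3_client.get_object(bucket: source_bucket, key: object_key) do |chunk|
      writer.write(chunk)
    end
  ensure
    writer.close
  end
end

crypto = GPGME::Crypto.new
s3 = Aws::S3::Resource.new(client: s3_client)

# upload_stream yields a writable IO backed by a multipart upload;
# decrypt writes plaintext straight into it, so nothing hits local disk
s3.bucket(dest_bucket).object('out.txt').upload_stream do |write_stream|
  crypto.decrypt(reader, password: password, output: write_stream)
end
downloader.join
reader.close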
