Upload large files as a stream to S3 with plain JavaScript using aws-sdk-js

There is a pretty nice example available for uploading large files to S3 via the aws-sdk-js library, but unfortunately it relies on the Node.js fs module.

Is there a way to achieve the same thing in plain JavaScript? There is also a nice Gist that breaks a large file into smaller chunks, but it is still missing the .pipe functionality of the Node.js fs module, which is needed to pass a stream to the aws-sdk-js upload function. Here is a relevant code snippet in Node:

var fs = require('fs');
var zlib = require('zlib');

var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
var s3obj = new AWS.S3({params: {Bucket: 'myBucket', Key: 'myKey'}});
s3obj.upload({Body: body})
  .on('httpUploadProgress', function(evt) {
    console.log('Progress:', evt.loaded, '/', evt.total);
  })
  .send(function(err, data) { console.log(err, data); });

Is there something similar available in plain JS (not Node.js)? It needs to be usable with Rails.

Specifically, I am looking for an alternative to the following line in plain JS:

var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());

The same link you provided contains an implementation intended for the browser, and it also uses the AWS client SDK:

// Get our File object
var file = $('#file-chooser')[0].files[0];

// Upload the File
var bucket = new AWS.S3({params: {Bucket: 'myBucket'}});
var params = {Key: file.name, ContentType: file.type, Body: file};
bucket.upload(params, function (err, data) {
  $('#results').html(err ? 'ERROR!' : 'UPLOADED.');
});
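Note that direct browser uploads like this also require the S3 bucket's CORS configuration to allow requests from your page's origin.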

** EDITS **

Note that the documentation for the Body field includes Blob, which means streaming will occur:

Body — (Buffer, Typed Array, Blob, String, ReadableStream)
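
Since Body accepts a Blob, one way to reproduce the gzip half of the Node snippet in the browser is to compress the File before handing it to the SDK. The following is a minimal sketch that reuses the bucket and file variables from the example above and assumes a browser that supports the CompressionStream API; gzipFile is a hypothetical helper, not part of the SDK:

// Browser-side analogue of fs.createReadStream('bigfile').pipe(zlib.createGzip()).
// Assumes CompressionStream support; older browsers would need a library
// such as pako instead.
async function gzipFile(file) {
  var gzipped = file.stream().pipeThrough(new CompressionStream('gzip'));
  // Collapse the compressed stream back into a Blob, which the SDK
  // accepts directly as a Body value.
  return new Response(gzipped).blob();
}

gzipFile(file).then(function (body) {
  var params = {Key: file.name + '.gz', ContentType: 'application/gzip', Body: body};
  bucket.upload(params, function (err, data) {
    console.log(err, data);
  });
});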

You can also use the Event Emitter convention offered by the AWS SDK's ManagedUpload interface if you want to monitor progress. Here is an example:

var managed = bucket.upload(params);
managed.on('httpUploadProgress', function (progress) {
  console.log('progress', progress.loaded, '/', progress.total);
});
managed.send(function (err, data) {
  $('#results').html(err ? 'ERROR!' : 'UPLOADED.');
});

If you want to read the file from your local system in chunks before you send it to s3.uploadPart, you'll want to do something with Blob.slice, perhaps defining a pipe chain.
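
For illustration, here is a rough sketch of that multipart approach, assuming aws-sdk-js v2 with credentials already configured; uploadInParts is a hypothetical helper, and retries as well as aborting the multipart upload on failure are omitted for brevity:

var s3 = new AWS.S3();
var PART_SIZE = 5 * 1024 * 1024; // every part except the last must be at least 5 MB

function uploadInParts(file, bucketName, key) {
  return s3.createMultipartUpload({Bucket: bucketName, Key: key}).promise()
    .then(function (created) {
      var uploadId = created.UploadId;

      // Slice the File (a Blob) into parts and upload them in parallel.
      var partPromises = [];
      var partNumber = 1;
      for (var start = 0; start < file.size; start += PART_SIZE) {
        partPromises.push(sendPart(file.slice(start, start + PART_SIZE), partNumber++));
      }

      function sendPart(blob, partNumber) {
        return s3.uploadPart({
          Bucket: bucketName,
          Key: key,
          UploadId: uploadId,
          PartNumber: partNumber,
          Body: blob
        }).promise().then(function (res) {
          // S3 needs each part's ETag to assemble the final object.
          return {ETag: res.ETag, PartNumber: partNumber};
        });
      }

      return Promise.all(partPromises).then(function (parts) {
        return s3.completeMultipartUpload({
          Bucket: bucketName,
          Key: key,
          UploadId: uploadId,
          MultipartUpload: {Parts: parts}
        }).promise();
      });
    });
}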
