Upload large files as a stream to S3 with plain JavaScript using aws-sdk-js
There is a pretty nice example available for uploading large files to S3 via the aws-sdk-js library, but unfortunately it uses the Node.js fs module.
Is there a way to achieve the same thing in plain JavaScript? There is a nice Gist as well which breaks the large file down into smaller chunks, but it is still missing the .pipe functionality of Node.js fs, which is required to pass the stream to the aws-sdk-js upload function. Here is the relevant code snippet in Node.js:
var AWS = require('aws-sdk');
var fs = require('fs');
var zlib = require('zlib');

// Stream the file from disk and gzip it on the fly
var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());

var s3obj = new AWS.S3({params: {Bucket: 'myBucket', Key: 'myKey'}});
s3obj.upload({Body: body})
  .on('httpUploadProgress', function (evt) {
    console.log('Progress:', evt.loaded, '/', evt.total);
  })
  .send(function (err, data) { console.log(err, data); });
Is there something similar available in plain JS (non-Node.js)? It needs to be usable with Rails.
Specifically, I am looking for an alternative to the following line in plain JS:
var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
The same link you provided contains an implementation intended for the browser, and it also uses the AWS client SDK:
// Get our File object
var file = $('#file-chooser')[0].files[0];

// Upload the File
var bucket = new AWS.S3({params: {Bucket: 'myBucket'}});
var params = {Key: file.name, ContentType: file.type, Body: file};
bucket.upload(params, function (err, data) {
  $('#results').html(err ? 'ERROR!' : 'UPLOADED.');
});
**EDITS**

Note that the documentation for the Body field includes Blob, which means streaming will occur:

Body — (Buffer, Typed Array, Blob, String, ReadableStream)
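Since Blob is one of the accepted Body types, one way to approximate the question's fs/zlib pipe in plain browser JavaScript is the CompressionStream API. This goes beyond the original answer and assumes a modern browser; the gzipFile helper and the myKey.gz key are illustrative names, not part of any SDK. A minimal sketch:

// Hedged sketch: approximate fs.createReadStream('bigfile').pipe(zlib.createGzip())
// by piping the File's ReadableStream through a gzip CompressionStream,
// then collecting the result back into a Blob (a documented Body type).
function gzipFile(file) {
  var compressed = file.stream().pipeThrough(new CompressionStream('gzip'));
  return new Response(compressed).blob();
}

gzipFile($('#file-chooser')[0].files[0]).then(function (blob) {
  bucket.upload(
    {Key: 'myKey.gz', ContentType: 'application/gzip', Body: blob},
    function (err, data) { console.log(err, data); }
  );
});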
You can also use the Event Emitter convention in the client offered by the AWS SDK's ManagedUpload interface if you want to monitor progress. Here is an example:
var managed = bucket.upload(params);
managed.on('httpUploadProgress', function (progress) {
  // progress.loaded / progress.total track the bytes sent so far
  console.log('progress', progress.loaded, '/', progress.total);
});
managed.send(function (err, data) {
  $('#results').html(err ? 'ERROR!' : 'UPLOADED.');
});
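The same ManagedUpload handle also documents an abort() method, so an in-flight transfer can be cancelled (the send() callback then receives an error):

// Cancel the transfer from, e.g., a "cancel" button handler
managed.abort();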
If you want to read the file from your local system in chunks before you send it to s3.uploadPart, you'll want to do something with Blob.slice, perhaps defining a Pipe Chain. A hedged sketch of that chunked approach follows.
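This sketch reuses the bucket client from above; the uploadInChunks name is illustrative, and the 5 MB part size reflects the S3 requirement that every part except the last be at least 5 MB:

// Hedged sketch: multipart upload driven by Blob.slice
async function uploadInChunks(file, key) {
  var partSize = 5 * 1024 * 1024; // S3 minimum for all parts but the last
  var create = await bucket.createMultipartUpload(
    {Key: key, ContentType: file.type}).promise();
  var parts = [];
  for (var start = 0, num = 1; start < file.size; start += partSize, num++) {
    // Blob.slice hands us one chunk without reading the whole file into memory
    var chunk = file.slice(start, start + partSize);
    var part = await bucket.uploadPart({
      Key: key,
      UploadId: create.UploadId,
      PartNumber: num,
      Body: chunk
    }).promise();
    parts.push({ETag: part.ETag, PartNumber: num});
  }
  // S3 assembles the object from the ordered part list
  return bucket.completeMultipartUpload({
    Key: key,
    UploadId: create.UploadId,
    MultipartUpload: {Parts: parts}
  }).promise();
}

Usage: uploadInChunks(file, file.name).then(...). Note that bucket.upload already performs multipart chunking internally for large bodies; dropping down to uploadPart is only worthwhile when you need to control the chunking yourself.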