
Upload large files as a stream to S3 with plain JavaScript using AWS-SDK-JS

There is a pretty nice example available for uploading large files to S3 via the aws-sdk-js library, but unfortunately it uses the Node.js fs module.

Is there a way we can achieve the same thing in plain JavaScript? Here is a nice Gist as well, which breaks the large file into smaller chunks; however, it is still missing the .pipe functionality of the Node.js fs module, which is required to pass a stream to the aws-sdk-js upload function. Here is a relevant code snippet in Node as well.

var fs = require('fs');
var zlib = require('zlib');

var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
var s3obj = new AWS.S3({params: {Bucket: 'myBucket', Key: 'myKey'}});
s3obj.upload({Body: body}).
  on('httpUploadProgress', function(evt) {
    console.log('Progress:', evt.loaded, '/', evt.total); 
  }).
  send(function(err, data) { console.log(err, data) });

Is there something similar available in plain JS (non-Node.js)? Usable with Rails.

Specifically, an alternative to the following line in plain JS.

var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());

The same link you provided contains an implementation intended for the browser, and it also uses the AWS client SDK:

// Get our File object
var file = $('#file-chooser')[0].files[0];

// Upload the File
var bucket = new AWS.S3({params: {Bucket: 'myBucket'}});
var params = {Key: file.name, ContentType: file.type, Body: file};
bucket.upload(params, function (err, data) {
  $('#results').html(err ? 'ERROR!' : 'UPLOADED.');
});

** EDITS **

Note that the documentation for the Body field includes Blob, which means streaming will occur:

Body — (Buffer, Typed Array, Blob, String, ReadableStream)
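Since Body accepts a Blob or ReadableStream, a modern-browser analogue of the Node line `fs.createReadStream('bigfile').pipe(zlib.createGzip())` can be sketched with `Blob.stream()` and `CompressionStream`. This is not part of the original answer and assumes a runtime that supports the Compression Streams API:

```javascript
// Sketch (assumption, not from the original answer): gzip a File/Blob
// in plain JS. Blob.stream() plays the role of fs.createReadStream,
// and CompressionStream plays the role of zlib.createGzip().
async function gzipBlob(blob) {
  // Pipe the file's bytes through a gzip transform stream.
  const compressed = blob.stream().pipeThrough(new CompressionStream('gzip'));
  // Collect the compressed stream back into a Blob usable as an upload Body.
  return new Blob([await new Response(compressed).arrayBuffer()]);
}
```

The resulting Blob could then be passed as the Body, e.g. `gzipBlob(file).then(function (body) { bucket.upload({Key: file.name, Body: body}, callback); })`.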

You can also use the Event Emitter convention offered by the AWS SDK's ManagedUpload interface if you care to monitor progress. Here is an example:

var managed = bucket.upload(params)
managed.on('httpUploadProgress', function (bytes) {
  console.log('progress', bytes.loaded, '/', bytes.total)
})
managed.send(function (err, data) {
  $('#results').html(err ? 'ERROR!' : 'UPLOADED.');
})

If you want to read the file from your local system in chunks before you send it to s3.uploadPart, you'll want to do something with Blob.slice, perhaps defining a Pipe Chain.
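The Blob.slice step can be sketched as follows. This is an illustration, not the answer's own code; the part size is an assumption (S3 requires every part except the last to be at least 5 MiB):

```javascript
// Sketch (assumed part size): split a File/Blob into chunks with
// Blob.slice, one chunk per eventual s3.uploadPart call.
var PART_SIZE = 5 * 1024 * 1024; // 5 MiB, S3's minimum for non-final parts

function sliceIntoParts(blob, partSize) {
  partSize = partSize || PART_SIZE;
  var parts = [];
  for (var start = 0; start < blob.size; start += partSize) {
    // Blob.slice clamps the end index, so the last part may be smaller.
    parts.push(blob.slice(start, start + partSize));
  }
  return parts;
}
```

Each element of the returned array would become the Body of one s3.uploadPart call (with PartNumber = index + 1), followed by completeMultipartUpload once every part has succeeded.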
