NodeJs - How to convert chunks of data to new Buffer?
In Node.js I receive a large amount of data from a file upload, delivered in chunks that each hold part of the file. I want to convert it with new Buffer() and then upload it to Amazon S3.
This works when there is only one chunk, but when there are multiple chunks I don't know how to call new Buffer().
My current workaround is to write the chunks into a real file on my own server, then send that file's PATH to Amazon S3.
How can I skip the file-creation step and send the buffer to Amazon S3 directly?
I think you need to use streaming-s3:
var streamingS3 = require('streaming-s3');

var uploadFile = function (fileReadStream, awsHeader, cb) {
    // Options for the streaming module
    var options = {
        concurrentParts: 2,
        waitTime: 20000,
        retries: 2,
        maxPartSize: 10 * 1024 * 1024
    };
    // Create the uploader; aws.accessKey / aws.secretKey are assumed
    // to be defined elsewhere with your credentials.
    var uploader = new streamingS3(fileReadStream, aws.accessKey, aws.secretKey, awsHeader, options);
    // Start uploading (important if no callback was provided).
    uploader.begin();

    uploader.on('data', function (bytesRead) {
        console.log(bytesRead, ' bytes read.');
    });
    uploader.on('part', function (number) {
        console.log('Part ', number, ' uploaded.');
    });
    // All parts uploaded, but upload not yet acknowledged.
    uploader.on('uploaded', function (stats) {
        console.log('Upload stats: ', stats);
    });
    uploader.on('finished', function (response, stats) {
        console.log(response);
        cb(null, response);
    });
    uploader.on('error', function (err) {
        console.log('Upload error: ', err);
        cb(err);
    });
};