I want to upload some large files directly to S3 from the browser with Node.js, but it's unclear how to prepare a file like this for upload. There might be a better module (like Knox) to handle this case, but I'm not sure. Any thoughts?
File Object
file: {
  webkitRelativePath: '',
  lastModifiedDate: '2013-06-22T02:43:54.000Z',
  name: '04-Bro Safari & UFO! - Animal.mp3',
  type: 'audio/mp3',
  size: 11082039
}
S3 putObject
var params = {
  Bucket: 'bucket_name', // bucket names can't contain slashes; the "folder" path belongs in the Key
  Key: req.user._id + '/folder/' + req.body['file']['name'],
  Body: ???
};
s3.putObject(params, function(err, data) {
  if (err)
    console.log(err);
  else
    console.log("Successfully uploaded data to myBucket/myKey");
});
Streaming is now supported (see the docs); simply pass the stream as the Body:
var fs = require('fs');
var AWS = require('aws-sdk');

var someDataStream = fs.createReadStream('bigfile');
var s3 = new AWS.S3({ params: { Bucket: 'myBucket', Key: 'myKey' } });

s3.putObject({ Body: someDataStream }, function(err, data) {
  // handle response
});
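Note that putObject needs to know the content length up front; the SDK can determine it for a plain file stream, but not for an arbitrary stream. In that case, the v2 SDK's s3.upload() performs a managed multipart upload instead. A minimal sketch, assuming the same bucket and key as above:

var fs = require('fs');
var AWS = require('aws-sdk');

var s3 = new AWS.S3();
var stream = fs.createReadStream('bigfile');

// upload() accepts streams of unknown length and sends them in parts
var managed = s3.upload(
  { Bucket: 'myBucket', Key: 'myKey', Body: stream },
  function (err, data) {
    if (err) console.log(err);
    else console.log('Uploaded to', data.Location);
  }
);

// optional progress reporting (total may be undefined for unsized streams)
managed.on('httpUploadProgress', function (progress) {
  console.log(progress.loaded, 'bytes sent');
});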
The s3.putObject() method does not stream, and from what I can see, the s3 module doesn't support streaming. With Knox, however, you can use Client.putStream(). Using the file object from your question, you can do something like this:
var fs = require('fs');
var knox = require('knox');

var stream = fs.createReadStream('./file');

var client = knox.createClient({
  key: '<api-key-here>',
  secret: '<secret-here>',
  bucket: 'learnboost'
});

// `file` is the file object from the question
var headers = {
  'Content-Length': file.size,
  'Content-Type': file.type
};

client.putStream(stream, '/path.ext', headers, function(err, res) {
  // error or successful upload
});
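If the upload runs server-side and you don't have the browser's file object, the headers can be derived from the file on disk instead. A sketch using fs.stat, reusing the same Knox client (the content type here is an assumption):

var fs = require('fs');
var knox = require('knox');

var client = knox.createClient({
  key: '<api-key-here>',
  secret: '<secret-here>',
  bucket: 'learnboost'
});

// derive Content-Length from the file itself instead of the browser object
fs.stat('./file', function (err, stat) {
  if (err) throw err;
  var headers = {
    'Content-Length': stat.size,
    'Content-Type': 'audio/mp3' // assumed; detect it with a library such as `mime`
  };
  client.putStream(fs.createReadStream('./file'), '/path.ext', headers, function (err, res) {
    // error or successful upload
  });
});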
One option is to use multer-s3 instead: https://www.npmjs.com/package/multer-s3. This post also has some details: Uploading images to S3 using NodeJS and Multer. How to upload whole file onFileUploadComplete
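A minimal multer-s3 sketch; the bucket name and route are placeholders, and the key callback mirrors the question's user/folder layout (which assumes auth middleware has set req.user):

var express = require('express');
var multer = require('multer');
var multerS3 = require('multer-s3');
var AWS = require('aws-sdk');

var app = express();
var s3 = new AWS.S3();

var upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: 'my-bucket', // placeholder
    contentType: multerS3.AUTO_CONTENT_TYPE,
    key: function (req, file, cb) {
      // e.g. <userId>/folder/<originalname>, as in the question
      cb(null, req.user._id + '/folder/' + file.originalname);
    }
  })
});

// multer-s3 streams the file to S3 as it is received
app.post('/upload', upload.single('file'), function (req, res) {
  res.send('Uploaded to ' + req.file.location);
});

app.listen(3000);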
Your code isn't streaming. You need to see a call to pipe somewhere, or at least code that pipes by hand using data event handlers. You are probably using the express bodyParser middleware, which is NOT a streaming implementation: it stores the entire request body as a temporary file on the local filesystem.
I'm not going to provide specific suggestions because of the promising results I got from a web search for "node.js s3 stream". Spend 5 minutes reading, then post a snippet that is at least an attempt at streaming, and we can help you get it right once you have something in the ballpark.
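For orientation, "streaming" here means the request body is piped onward as it arrives rather than buffered whole. A bare-bones sketch (the output path and port are placeholders), writing the raw request body to disk with no bodyParser in the way:

var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  var out = fs.createWriteStream('/tmp/upload'); // placeholder destination
  req.pipe(out); // chunks flow through as they arrive; nothing is buffered whole
  // the equivalent "by hand", with data event handlers:
  // req.on('data', function (chunk) { out.write(chunk); });
  // req.on('end', function () { out.end(); });
  out.on('finish', function () { res.end('done'); });
}).listen(3000);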
In v3, PutObjectCommand cannot write a file stream to S3. You need the @aws-sdk/lib-storage library to upload buffers and streams.
Example:
const { S3Client } = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage');

const s3Client = new S3Client({});

const upload = async (fileStream) => {
  const uploadParams = {
    Bucket: 'test-bucket',
    Key: 'image1.png',
    Body: fileStream,
  };
  try {
    const parallelUpload = new Upload({
      client: s3Client,
      params: uploadParams,
    });
    // report progress as parts complete
    parallelUpload.on("httpUploadProgress", (progress) => {
      console.log(progress);
    });
    await parallelUpload.done();
  } catch (e) {
    console.log(e);
  }
};
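Usage is then just a matter of handing the helper a readable stream (the filename below is a placeholder):

const fs = require('fs');

// any readable stream works as the Body
upload(fs.createReadStream('./big-file.mp4'));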
Ref - https://github.com/aws/aws-sdk-js-v3/blob/main/UPGRADING.md#s3-multipart-upload