
Upload a binary file to S3 using AWS SDK for Node.js

Update: For future reference, Amazon have now updated the documentation from what was there at the time of asking. As per @Loren Segal's comment below:

We've corrected the docs in the latest preview release to document this parameter properly. Sorry about the mixup!


I'm trying out the developer preview of the AWS SDK for Node.js and want to upload a zipped tarball to S3 using putObject.

According to the documentation, the Body parameter should be...

Body - (Base64 Encoded Data)

...therefore, I'm trying out the following code...

var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: 'key', secretAccessKey: 'secret' });

// Read in the file, convert it to base64, store to S3
fs.readFile('myarchive.tgz', function (err, data) {
  if (err) { throw err; }

  var base64data = new Buffer(data, 'binary').toString('base64');

  var s3 = new AWS.S3();
  s3.client.putObject({
    Bucket: 'mybucketname',
    Key: 'myarchive.tgz',
    Body: base64data
  }).done(function (resp) {
    console.log('Successfully uploaded package.');
  });

});

Whilst I can then see the file in S3, if I download it and attempt to decompress it I get an error that the file is corrupted. Therefore it seems that my method for 'base64 encoded data' is off.

Can someone please help me to upload a binary file using putObject?

You don't need to convert the buffer to a base64 string. Just set Body to data and it will work.
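
For example, the asker's snippet reduces to something like the following minimal sketch (using the plain s3.putObject(params, callback) form, as in the streaming answer below, rather than the preview's s3.client.putObject().done(); the bucket and key names are placeholders):

var AWS = require('aws-sdk'),
    fs = require('fs');

// Pass the raw Buffer from fs.readFile straight to Body; no base64 conversion
fs.readFile('myarchive.tgz', function (err, data) {
  if (err) { throw err; }

  var s3 = new AWS.S3();
  s3.putObject({
    Bucket: 'mybucketname',
    Key: 'myarchive.tgz',
    Body: data
  }, function (err) {
    if (err) { throw err; }
    console.log('Successfully uploaded package.');
  });
});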

Here is a way to send a file using streams, which might be necessary for large files and will generally reduce memory overhead:

var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: 'key', secretAccessKey: 'secret' });

// Stream the file from disk straight to S3 (no base64 conversion)
var fileStream = fs.createReadStream('myarchive.tgz');
fileStream.on('error', function (err) {
  if (err) { throw err; }
});  
fileStream.on('open', function () {
  var s3 = new AWS.S3();
  s3.putObject({
    Bucket: 'mybucketname',
    Key: 'myarchive.tgz',
    Body: fileStream
  }, function (err) {
    if (err) { throw err; }
  });
});
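
As a side note, later releases of the v2 SDK also provide s3.upload(), which accepts a readable stream for Body and manages multipart uploads for large files; a minimal sketch along the same lines, using the same placeholder bucket and key as above:

var AWS = require('aws-sdk'),
    fs = require('fs');

var s3 = new AWS.S3();

// upload() accepts a stream and handles multipart uploads internally
s3.upload({
  Bucket: 'mybucketname',
  Key: 'myarchive.tgz',
  Body: fs.createReadStream('myarchive.tgz')
}, function (err, data) {
  if (err) { throw err; }
  console.log('Successfully uploaded to', data.Location);
});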

I was able to upload my binary file this way.

var fileStream = fs.createReadStream("F:/directory/fileName.ext");
var putParams = {
    Bucket: s3bucket,
    Key: s3key,
    Body: fileStream
};
s3.putObject(putParams, function(putErr, putData){
    if(putErr){
        console.error(putErr);
    } else {
        console.log(putData);
    }
});
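
To sanity-check that the object actually landed, a headObject call can follow the upload; a small sketch reusing the same (placeholder) s3, s3bucket and s3key from the snippet above:

// Fetch the object's metadata to confirm the upload and report its size
s3.headObject({ Bucket: s3bucket, Key: s3key }, function(headErr, headData){
    if(headErr){
        console.error(headErr);
    } else {
        console.log('Uploaded object size:', headData.ContentLength, 'bytes');
    }
});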
