
"Unsupported body payload object" when trying to upload to Amazon S3

I want to upload a file from my frontend to Amazon S3 (AWS).

I'm using Dropzone, so I convert my file and send it to my backend.

In my backend, the file looks like this:

{
  fieldname: 'file',
  originalname: 'test.torrent',
  encoding: '7bit',
  mimetype: 'application/octet-stream',
  buffer: { type: 'Buffer', data: [Array] },
  size: 7449
}

and when I try to upload my file with my function:

var file = data.patientfile.file.buffer;

var params = { Bucket: myBucket, Key: data.patientfile.file.fieldname, Body: file };

s3.upload(params, function (err, data) {
    if (err) {
        console.log("******************", err);
    } else {
        console.log("Successfully uploaded data to myBucket/myKey");
    }
});

I get as error:

Unsupported body payload object

Do you know how I can send my file?

I have tried to send it with putObject and got a similar error.

I think you might need to convert the file content (which in this case is probably data.patientfile.file.buffer) to binary:

var base64data = Buffer.from(data, 'binary'); // Buffer.from() replaces the deprecated new Buffer()

so the params would be like:

var params = { Bucket: myBucket, Key: data.patientfile.file.fieldname, Body: base64data };

Or if I'm mistaken and the buffer is already in binary, then you can try:

var params = { Bucket: myBucket, Key: data.patientfile.file.fieldname, Body: data.patientfile.file.buffer};

This is my production code, which is working.

Please note the issue can happen at data1111.

But to give the full picture, all the key parts of the working code are included below.

client:

// html

<input
  type="file"
  onChange={this.onFileChange}
  multiple
/>


// javascript

onFileChange = event => {
    const files = event.target.files;
    var file = files[0];
    var reader = new FileReader();
    reader.onloadend = function(e) {

        // save this data1111 and send to server
        let data1111 = e.target.result // reader.result // ----------------- data1111

    };
    reader.readAsBinaryString(file);
}

server:

// node.js/ javascript

const response = await s3
  .upload({
    Bucket: s3Bucket, // bucket
    Key: s3Path, // folder/file
    // receiving at the server - data1111 - via request body (or other)
    Body: Buffer.from(req.body.data1111, "binary") // ----------------- data1111
  })
  .promise();
return response;

Getting the above code working took a full two days.

Hope this helps someone in the future.

I implemented Glen k's answer with Node.js ... it worked for me:

const AWS = require('aws-sdk');

const s3 = new AWS.S3({
    accessKeyId: process.env.AWSAccessKeyID,
    secretAccessKey: process.env.AWSSecretAccessKey,
});

let base64data = Buffer.from(file.productImg.data, 'binary');

const params = {
    Bucket: BUCKET_NAME,
    Key: KEY,
    Body: base64data
};

s3.upload(params, function (err, data) {
    if (err) {
        console.log(err);
        throw err;
    }
    console.log(data);
    console.log(`File uploaded successfully. ${data.Location}`);
});
