Upload a file to Amazon S3 with NodeJS

I ran into a problem while trying to upload a file to my S3 bucket. Everything works except that my file parameters do not seem appropriate. I am using the Amazon S3 SDK to upload from Node.js to S3.

These are my route settings:

var multiparty = require('connect-multiparty'),
    multipartyMiddleware = multiparty();
app.route('/api/items/upload').post(multipartyMiddleware, items.upload);

This is the items.upload() function:

exports.upload = function(req, res) {
    var file = req.files.file;
    var s3bucket = new AWS.S3({params: {Bucket: 'mybucketname'}});
    s3bucket.createBucket(function() {
        var params = {
            Key: file.name,
            Body: file
        };
        s3bucket.upload(params, function(err, data) {
            console.log("PRINT FILE:", file);
            if (err) {
                console.log('ERROR MSG: ', err);
            } else {
                console.log('Successfully uploaded data');
            }
        });
    });
};

Setting the Body param to a string like "hello" works fine. According to the docs, the Body param must be a Buffer, Typed Array, Blob, String, or ReadableStream. However, uploading a file object fails with the following error message:

[Error: Unsupported body payload object]

This is the file object:

{ fieldName: 'file',
  originalFilename: 'second_fnp.png',
  path: '/var/folders/ps/l8lvygws0w93trqz7yj1t5sr0000gn/T/26374-7ttwvc.png',
  headers: 
   { 'content-disposition': 'form-data; name="file"; filename="second_fnp.png"',
     'content-type': 'image/png' },
  ws: 
   { _writableState: 
      { highWaterMark: 16384,
        objectMode: false,
        needDrain: true,
        ending: true,
        ended: true,
        finished: true,
        decodeStrings: true,
        defaultEncoding: 'utf8',
        length: 0,
        writing: false,
        sync: false,
        bufferProcessing: false,
        onwrite: [Function],
        writecb: null,
        writelen: 0,
        buffer: [],
        errorEmitted: false },
     writable: true,
     domain: null,
     _events: { error: [Object], close: [Object] },
     _maxListeners: 10,
     path: '/var/folders/ps/l8lvygws0w93trqz7yj1t5sr0000gn/T/26374-7ttwvc.png',
     fd: null,
     flags: 'w',
     mode: 438,
     start: undefined,
     pos: undefined,
     bytesWritten: 261937,
     closed: true },
  size: 261937,
  name: 'second_fnp.png',
  type: 'image/png' }

Any help will be greatly appreciated!

So it looks like there are a few things going wrong here. Based on your post it looks like you are attempting to support file uploads using the connect-multiparty middleware. What this middleware does is take the uploaded file, write it to the local filesystem, and then set req.files to the uploaded file(s).

The configuration of your route looks fine; the problem looks to be with your items.upload() function, in particular with this part:

var params = {
  Key: file.name,
  Body: file
};

As I mentioned at the beginning of my answer, connect-multiparty writes the file to the local filesystem, so you'll need to open and read the file, upload it, and then delete it from the local filesystem.

That said, you could update your method to something like the following:

var fs = require('fs');
exports.upload = function (req, res) {
    var file = req.files.file;
    fs.readFile(file.path, function (err, data) {
        if (err) throw err; // Something went wrong!
        var s3bucket = new AWS.S3({params: {Bucket: 'mybucketname'}});
        s3bucket.createBucket(function () {
            var params = {
                Key: file.originalFilename, //file.name doesn't exist as a property
                Body: data
            };
            s3bucket.upload(params, function (err, data) {
                // Whether there is an error or not, delete the temp file
                fs.unlink(file.path, function (err) {
                    if (err) {
                        console.error(err);
                    }
                    console.log('Temp file deleted');
                });

                console.log("PRINT FILE:", file);
                if (err) {
                    console.log('ERROR MSG: ', err);
                    res.status(500).send(err);
                } else {
                    console.log('Successfully uploaded data');
                    res.status(200).end();
                }
            });
        });
    });
};

What this does is read the uploaded file from the local filesystem, upload it to S3, delete the temporary file, and send a response.

There are a few problems with this approach. First, it's not as efficient as it could be: for large files, you load the entire file into memory before you write it. Second, this process doesn't support multipart uploads for large files (I think the cutoff is 5 MB before you have to do a multipart upload).
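For context, the multipart constraint mentioned above can be sketched without any SDK. This is illustrative arithmetic only, not the AWS SDK's actual planner: every part except the last must be at least 5 MB, and an upload may have at most 10,000 parts, so the part size has to grow for very large files.

```javascript
// Sketch of S3 multipart sizing rules (illustrative, not the SDK's exact logic).
const MIN_PART_SIZE = 5 * 1024 * 1024; // 5 MB minimum for all parts but the last
const MAX_PARTS = 10000;               // hard cap on parts per upload

function planMultipart(fileSize) {
  // Double the part size until the file fits within 10,000 parts.
  let partSize = MIN_PART_SIZE;
  while (Math.ceil(fileSize / partSize) > MAX_PARTS) {
    partSize *= 2;
  }
  return { partSize: partSize, partCount: Math.ceil(fileSize / partSize) };
}

console.log(planMultipart(261937));            // the example PNG fits in a single part
console.log(planMultipart(100 * 1024 * 1024)); // a 100 MB file needs 20 parts of 5 MB
```

Anything below the 5 MB threshold goes up as a single PUT; libraries like S3FS make this decision for you behind the scenes.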

What I would suggest instead is that you use a module I've been working on called S3FS, which provides an interface similar to the native fs module in Node.js but abstracts away some of the details, such as multipart uploads and the S3 API (and adds some additional functionality, like recursive methods).

If you were to pull in the S3FS library, your code would look something like this:

var fs = require('fs'),
    S3FS = require('s3fs'),
    s3fsImpl = new S3FS('mybucketname', {
        accessKeyId: 'XXXXXXXXXXX',
        secretAccessKey: 'XXXXXXXXXXXXXXXXX'
    });

// Create our bucket if it doesn't exist
s3fsImpl.create();

exports.upload = function (req, res) {
    var file = req.files.file;
    var stream = fs.createReadStream(file.path);
    return s3fsImpl.writeFile(file.originalFilename, stream).then(function () {
        fs.unlink(file.path, function (err) {
            if (err) {
                console.error(err);
            }
        });
        res.status(200).end();
    });
};

What this does is instantiate the module for the provided bucket and AWS credentials, then create the bucket if it doesn't exist. When a request comes through to upload a file, we open a stream to the file and use it to write the file to S3 at the specified path. This handles the multipart upload behind the scenes (if needed) and has the benefit of being done through a stream, so you don't have to wait to read the whole file before you start uploading it.

If you prefer, you could change the code from promises to callbacks, or use the pipe() method with event listeners to detect completion and errors.

If you're looking for additional methods, check out the documentation for s3fs, and feel free to open an issue if you're missing a method or run into problems.

I found the following to be a working solution:

npm install aws-sdk


Once you've installed the aws-sdk, use the following code, replacing the values with yours where needed.

var AWS = require('aws-sdk');
var fs = require('fs');

var s3 = new AWS.S3();

// Bucket names must be unique across all S3 users
var myBucket = 'njera';
var myKey = 'jpeg';

// Works the same for any file type, e.g. 'demo.txt' or 'demo.avi'
fs.readFile('demo.jpg', function (err, data) {
    if (err) { throw err; }

    var params = { Bucket: myBucket, Key: myKey, Body: data };

    s3.putObject(params, function (err, data) {
        if (err) {
            console.log(err);
        } else {
            console.log("Successfully uploaded data to myBucket/myKey");
        }
    });
});

In case you're looking for references, I found a complete tutorial on the subject here:


How to upload files (text/image/video) in Amazon S3 using Node.js

Or using promises:

const AWS = require('aws-sdk');
AWS.config.update({
    accessKeyId: 'accessKeyId',
    secretAccessKey: 'secretAccessKey',
    region: 'region'
});

// await is only valid inside an async function
async function upload() {
    let params = {
        Bucket: "yourBucketName",
        Key: 'someUniqueKey',
        Body: 'someFile'
    };
    try {
        let uploadPromise = await new AWS.S3().putObject(params).promise();
        console.log("Successfully uploaded data to bucket");
    } catch (e) {
        console.log("Error uploading data: ", e);
    }
}

Uploading a file to AWS S3 and sending the URL in the response for accessing the file.

Multer is a Node.js middleware for handling multipart/form-data, which is primarily used for uploading files. It is written on top of busboy for maximum efficiency. Check out the npm module here.

When you send the request, make sure its Content-Type header is multipart/form-data. We send the file location in the response, which gives the URL, but if you want to access that URL, make the bucket public, or else you will not be able to access it.
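As a sketch of the client side (on Node 18+, FormData and Blob are built in, and FormData sets the multipart/form-data Content-Type, including its boundary, automatically; the URL and payload below are made-up examples):

```javascript
// Build a multipart/form-data body; the field name 'file' must match
// upload.single("file") on the server.
const form = new FormData();
form.append('file', new Blob(['hello world']), 'hello.txt');

// Sending it (commented out so this sketch has no network dependency):
// fetch('http://localhost:8080/api/file/upload', { method: 'POST', body: form })
//   .then(res => res.json())
//   .then(body => console.log(body.location));

console.log([...form.keys()]); // the single 'file' field
```

If you set Content-Type by hand instead of letting FormData do it, the boundary parameter will be missing and the server-side parser will reject the body.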

upload.router.js

const express = require('express');
const router = express.Router();
const AWS = require('aws-sdk');
const multer = require('multer');
const storage = multer.memoryStorage()
const upload = multer({storage: storage});

const s3Client = new AWS.S3({
    accessKeyId: 'your_access_key_id',
    secretAccessKey: 'your_secret_access_id',
    region :'ur region'
});

const uploadParams = {
         Bucket: 'ur_bucket_name', 
         Key: '', // pass key
         Body: null, // pass file body
};


router.post('/api/file/upload', upload.single("file"),(req,res) => {
    const params = uploadParams;

    uploadParams.Key = req.file.originalname;
    uploadParams.Body = req.file.buffer;

    s3Client.upload(params, (err, data) => {
        if (err) {
            return res.status(500).json({error: "Error -> " + err});
        }
        res.json({message: 'File uploaded successfully', 'filename':
            req.file.originalname, 'location': data.Location});
    });
});

module.exports = router;

app.js

const express = require('express');
const app = express();

const router = require('./app/routers/upload.router.js');
app.use('/', router);

// Create a Server
const server = app.listen(8080, () => {
    console.log("App listening at 8080");
});

Upload CSV/Excel

const fs = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3({
    accessKeyId: 'XXXXXXXXX',
    secretAccessKey: 'XXXXXXXXX'
});

const absoluteFilePath = "C:\\Project\\test.xlsx";

const uploadFile = () => {
  fs.readFile(absoluteFilePath, (err, data) => {
     if (err) throw err;
     const params = {
         Bucket: 'testBucket', // pass your bucket name
         Key: 'folderName/key.xlsx', // file will be saved in <folderName> folder
         Body: data
     };
      s3.upload(params, function (s3Err, data) {
          if (s3Err) throw s3Err;
          console.log(`File uploaded successfully at ${data.Location}`);
      });
  });
};

uploadFile();

Works for me :)

  const fileContent = fs.createReadStream(`${fileName}`);
  return new Promise(function (resolve, reject) {
    fileContent.once('error', reject);
    s3.upload(
      {
        Bucket: 'test-bucket',
        Key: `${fileName + '_' + Date.now().toString()}`,
        ContentType: 'application/pdf',
        ACL: 'public-read',
        Body: fileContent
      },
      function (err, result) {
        if (err) {
          reject(err);
          return;
        }
        resolve(result.Location);
      }
    );
  });

Thanks to David, as his solution helped me come up with my own for uploading multipart files from my Heroku-hosted site to an S3 bucket. I did it using formidable to handle the incoming form and fs to get the file content. Hopefully it may help you.

api.service.ts

public upload(files): Observable<any> {  
    const formData: FormData = new FormData(); 
    files.forEach(file => {
      // create a new multipart-form for every file 
      formData.append('file', file, file.name);           
    });   
    return this.http.post(uploadUrl, formData).pipe(
      map(this.extractData),
      catchError(this.handleError)); 
  }

server.js

app.post('/api/upload', upload);
app.use('/api/upload', router);

upload.js

const IncomingForm = require('formidable').IncomingForm;
const fs = require('fs');
const AWS = require('aws-sdk');

module.exports = function upload(req, res) {
    var form = new IncomingForm();

    const bucket = new AWS.S3(
      {
        signatureVersion: 'v4',
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        region: 'us-east-1'       
      }
    ); 

    form.on('file', (field, file) => {

        const fileContent = fs.readFileSync(file.path);

        const s3Params = {
            Bucket: process.env.AWS_S3_BUCKET,
            Key: 'folder/' + file.name,
            Expires: 60,             
            Body: fileContent,
            ACL: 'public-read'
        };

        bucket.upload(s3Params, function(err, data) {
            if (err) {
                throw err;
            }            
            console.log('File uploaded to: ' + data.Location);
            fs.unlink(file.path, function (err) {
              if (err) {
                  console.error(err);
              }
              console.log('Temp file deleted');
          });
        });
    });              

    // The second callback is called when the form is completely parsed. 
    // In this case, we want to send back a success status code.
    form.on('end', () => {        
      res.status(200).json('upload ok');
    });

    form.parse(req);
}

upload-image.component.ts

import { Component, OnInit, ViewChild, Output, EventEmitter, Input } from '@angular/core';
import { ApiService } from '../api.service';
import { MatSnackBar } from '@angular/material/snack-bar';

@Component({
  selector: 'app-upload-image',
  templateUrl: './upload-image.component.html',
  styleUrls: ['./upload-image.component.css']
})

export class UploadImageComponent implements OnInit {
  public files: Set<File> = new Set();
  @ViewChild('file', { static: false }) file;
  public uploadedFiles: Array<string> = new Array<string>();
  public uploadedFileNames: Array<string> = new Array<string>();
  @Output() filesOutput = new EventEmitter<Array<string>>();
  @Input() CurrentImage: string;
  @Input() IsPublic: boolean;
  @Output() valueUpdate = new EventEmitter();
  strUploadedFiles:string = '';
  filesUploaded: boolean = false;     

  constructor(private api: ApiService, public snackBar: MatSnackBar,) { }

  ngOnInit() {    
  }

  updateValue(val) {  
    this.valueUpdate.emit(val);  
  }  

  reset()
  {
    this.files = new Set();
    this.uploadedFiles = new Array<string>();
    this.uploadedFileNames = new Array<string>();
    this.filesUploaded = false;
  }

  upload() { 

    this.api.upload(this.files).subscribe(res => {   
      this.filesOutput.emit(this.uploadedFiles); 
      if (res == 'upload ok')
      {
        this.reset(); 
      }     
    }, err => {
      console.log(err);
    });
  }

  onFilesAdded() {
    var txt = '';
    const files: { [key: string]: File } = this.file.nativeElement.files;

    for (let key in files) {
      if (!isNaN(parseInt(key))) {

        var currentFile = files[key];
        var sFileExtension = currentFile.name.split('.')[currentFile.name.split('.').length - 1].toLowerCase();
        var iFileSize = currentFile.size;

        if (!(sFileExtension === "jpg" 
              || sFileExtension === "png") 
              || iFileSize > 671329) {
            txt = "File type : " + sFileExtension + "\n\n";
            txt += "Size: " + iFileSize + "\n\n";
            txt += "Please make sure your file is in jpg or png format and less than 655 KB.\n\n";
            alert(txt);
            return false;
        }

        this.files.add(files[key]);
        this.uploadedFiles.push('https://gourmet-philatelist-assets.s3.amazonaws.com/folder/' + files[key].name);
        this.uploadedFileNames.push(files[key].name);
        if (this.IsPublic && this.uploadedFileNames.length == 1)
        {
          this.filesUploaded = true;
          this.updateValue(files[key].name);
          break;
        } 
        else if (!this.IsPublic && this.uploadedFileNames.length == 3)
        {
          this.strUploadedFiles += files[key].name;          
          this.updateValue(this.strUploadedFiles); 
          this.filesUploaded = true;
          break;
        }
        else
        {
          this.strUploadedFiles += files[key].name + ",";          
          this.updateValue(this.strUploadedFiles); 
        }      
      }
    }    
  }

  addFiles() {
    this.file.nativeElement.click();  
  }

  openSnackBar(message: string, action: string) {
    this.snackBar.open(message, action, {
      duration: 2000,
      verticalPosition: 'top'
    });
  }   

}

upload-image.component.html

<input type="file" #file style="display: none" (change)="onFilesAdded()" multiple />
&nbsp;<button mat-raised-button color="primary" 
         [disabled]="filesUploaded" (click)="$event.preventDefault(); addFiles()">
  Add Files
</button>
&nbsp;<button class="btn btn-success" [disabled]="uploadedFileNames.length == 0" (click)="$event.preventDefault(); upload()">
  Upload
</button>
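The extension-and-size gate in onFilesAdded() above can be isolated as a small pure function (a hypothetical helper, not part of the component, using the same 671329-byte limit):

```javascript
// Mirrors the check in onFilesAdded(): only jpg/png files
// of at most 671329 bytes (~655 KB) are accepted.
function isAcceptableImage(fileName, fileSize) {
  const parts = fileName.split('.');
  const ext = parts[parts.length - 1].toLowerCase();
  const allowedType = ext === 'jpg' || ext === 'png';
  return allowedType && fileSize <= 671329;
}

console.log(isAcceptableImage('second_fnp.png', 261937)); // true
console.log(isAcceptableImage('notes.pdf', 1000));        // false: wrong type
```

Keeping the rule in one function makes it easy to reuse the same validation on the server, where it must be enforced anyway, since client-side checks can be bypassed.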
var express = require('express');
var app = module.exports = express();
var secureServer = require('http').createServer(app);
secureServer.listen(3001);

var aws = require('aws-sdk');
var multer = require('multer');
var multerS3 = require('multer-s3');

aws.config.update({
    secretAccessKey: "XXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    accessKeyId: "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    region: 'us-east-1'
});
var s3 = new aws.S3();

var upload = multer({
    storage: multerS3({
        s3: s3,
        dirname: "uploads",
        bucket: "Your bucket name",
        key: function (req, file, cb) {
            console.log(file);
            // use Date.now() for unique file keys
            cb(null, "uploads/profile_images/u_" + Date.now() + ".jpg");
        }
    })
});

app.post('/upload', upload.single('photos'), function (req, res, next) {
    console.log('Successfully uploaded ', req.file);
    res.send('Successfully uploaded ' + req.file.originalname);
});
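The Date.now() key naming used above is unique only to the millisecond; a small hypothetical helper (not part of multer-s3) makes the idea explicit and adds a random suffix to dodge same-millisecond collisions:

```javascript
// Hypothetical key generator: timestamp for ordering, random suffix
// to avoid collisions when two uploads land in the same millisecond.
function makeKey(prefix, ext) {
  const stamp = Date.now();
  const suffix = Math.random().toString(36).slice(2, 8);
  return prefix + '/u_' + stamp + '_' + suffix + '.' + ext;
}

console.log(makeKey('uploads/profile_images', 'jpg'));
```

Note that S3 has no real folders; the slashes in the key are just a naming convention that the console renders as a folder tree.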

Using AWS SDK v3

npm install @aws-sdk/client-s3

Upload code

import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

/**
* It is advisable to save your AWS credentials and configuration in an environment file, not inside the code.
* The AWS lib will automatically load AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY if available in your environment.
*/
const s3Client = new S3Client({ region: process.env.AWS_S3_REGION });

/**
* upload a file
* @param file the file object to be uploaded
* @param fileKey the fileKey. could be separated with '/' to nest the file into a folder structure. eg. members/user1/profile.png
*/
export function uploadFile(file, fileKey){
    // return the promise so callers can await it and handle errors
    return s3Client.send(new PutObjectCommand({
       Bucket: process.env.MY_AWS_S3_BUCKET,
       Key: fileKey,
       Body: file
    }));
}

And if you want to download:

import { GetObjectCommand } from "@aws-sdk/client-s3";
/**
 * download a file from AWS and send to your rest client
 */
app.get('/download', async function(req, res, next){
    var fileKey = req.query['fileKey'];

    var bucketParams = {
        Bucket: 'my-bucket-name',
        Key: fileKey,
    };

    res.attachment(fileKey);
    var fileStream = await s3Client.send(new GetObjectCommand(bucketParams));
    // for TS you can add: if (fileStream.Body instanceof Readable)
    fileStream.Body.pipe(res)
});

Here you can find the solution for the above error:

https://docs.amplify.aws/lib/storage/upload/q/platform/js/#browser-uploads
