
Uploading multiple files to AWS S3 using NodeJS

I'm trying to upload all files within my directory to my S3 bucket using NodeJS. I'm able to upload one file at a time if I explicitly give the file path + literal string for the `Key:` field.

Below is the script I'm using:

var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: '...', secretAccessKey: '...' });

// reg ex to match
var re = /\.txt$/;

// ensure that this file is in the directory of the files you want to run the cronjob on
fs.readdir(".", function(err, files) {
    if (err) {
        console.log("Could not list the directory.", err)
        process.exit(1)
    }


    var matches = files.filter( function(text) { return re.test(text) } )
    console.log("These are the files you have", matches)
    var numFiles = matches.length


    if ( numFiles ) {
        // Read in the file, convert it to base64, store to S3

        for( i = 0; i < numFiles; i++ ) {
            var fileName = matches[i]

            fs.readFile(fileName, function (err, data) {
                if (err) { throw err }

                // Buffer Pattern; how to handle buffers; straw, intake/outtake analogy
                var base64data = new Buffer(data, 'binary');


                var s3 = new AWS.S3()
                s3.putObject({
                    'Bucket': 'noonebetterhaventakenthisbucketnname',
                    'Key': fileName,
                    'Body': base64data,
                    'ACL': 'public-read'
                }, function (resp) {
                    console.log(arguments);
                    console.log('Successfully uploaded, ', fileName)
                })
            })

        }

    }

})

It produces this output for each file attempted to upload to S3:

These are the files you have [ 'test.txt', 'test2.txt' ]
{ '0': null,
  '1': { ETag: '"2cad20c19a8eb9bb11a9f76527aec9bc"' } }
Successfully uploaded,  test2.txt
{ '0': null,
  '1': { ETag: '"2cad20c19a8eb9bb11a9f76527aec9bc"' } }
Successfully uploaded,  test2.txt

Edit: updated to use a variable name (`fileName`) for the key instead of `matches[i]`.

Why does it only upload test2.txt, and how do I get it to upload each file within my matches variable?

Referenced this: Asynchronously reading and caching multiple files in nodejs, to arrive at a solution.

tl;dr: scope issue - the loop variables need to be wrapped in a closure. You can do this by creating a function for the readFile and s3.putObject calls and calling that function within the for loop, so each iteration gets its own binding for the file name.
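The capture problem the tl;dr describes can be reproduced without any file or S3 calls. This minimal sketch (plain Node, no dependencies) shows how `var` gives every closure the same function-scoped binding, while `let` (or a wrapper function, as in the answer below) gives each iteration its own:

```javascript
// With var, the loop variable is function-scoped: all closures share one i,
// and by the time they run the loop has finished, so they all see 3.
function makeVarClosures() {
    var fns = [];
    for (var i = 0; i < 3; i++) {
        fns.push(function () { return i; });
    }
    return fns.map(function (f) { return f(); });
}

// With let, each iteration gets a fresh binding, so each closure
// captures its own value - the same effect as wrapping the body
// in a separate function and passing the value in as a parameter.
function makeLetClosures() {
    const fns = [];
    for (let i = 0; i < 3; i++) {
        fns.push(function () { return i; });
    }
    return fns.map(function (f) { return f(); });
}

console.log(makeVarClosures()); // [ 3, 3, 3 ]
console.log(makeLetClosures()); // [ 0, 1, 2 ]
```

This is exactly why the question's script logged `test2.txt` twice: every `readFile` callback saw the final value of `fileName`.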

var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: '...', secretAccessKey: '...' });

var s3 = new AWS.S3()

function read(file) {
    fs.readFile(file, function (err, data) {
        if (err) { throw err }

        // fs.readFile already returns a Buffer, so it can be passed
        // to putObject directly - no base64 conversion is needed
        s3.putObject({
            'Bucket': 'noonebetterhaventakenthisbucketnname',
            'Key': file,
            'Body': data,
            'ACL': 'public-read'
        }, function (err, resp) {
            // the putObject callback receives (err, data)
            if (err) { throw err }
            console.log('Successfully uploaded', file, resp.ETag)
        })
    })
}

// reg ex to match
var re = /\.txt$/;

// ensure that this file is in the directory of the files you want to run the cronjob on
fs.readdir(".", function(err, files) {
    if (err) {
        console.log( "Could not list the directory.", err)
        process.exit( 1 )
    }

    var matches = files.filter( function(text) { return re.test(text) } )
    console.log("These are the files you have", matches)
    var numFiles = matches.length


    if ( numFiles ) {
        // Read in the file, convert it to base64, store to S3

        for (var i = 0; i < numFiles; i++) {
            read(matches[i])
        }

    }

})
