
Uploading multiple files from a Google Cloud VM to Google Cloud Storage using node.js and Glob

I am trying to use Node.js to upload multiple files from my Google Compute Engine VM's local directory to a GCS bucket that I have already created. I get the following error each time I run the script.

TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string. Received type function

The script:

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');
const fs = require('fs');
const glob = require('glob');

// The name of the bucket to access, eg "my-bucket"
const bucketName = "myBucket";

// Instantiates a client
const storage = new Storage({
    projectId: 'myprojectID',
    keyFilename: 'my GCS service key'
});

// Get files in the local directory of the VM
var allfiles = glob('folder/*.js', function (err, files) {
    if (err) {
        console.log(err);
    }
});

// Uploads VM local dir files to the bucket
storage
    .bucket(bucketName)
    .upload(allfiles)
    .then(() => {
        console.log(`${allfiles} uploaded to ${bucketName}.`);
    })
    .catch(err => {
        console.error('ERROR:', err);
    });

Apparently the upload process needs the file path as a string, but that is exactly what glob is supposed to provide. So why the error?

Any help will be seriously appreciated!

Your mistake is that you use allfiles as the return value of glob. That is not correct: since glob is asynchronous, the file names are available in the callback, not in the return value.

glob('folder/*.js', function (err, files) {
    if (err) {
        console.log(err);
        return;
    }

    var allfiles = files;

    // Uploads VM local dir files to the bucket
    storage
        .bucket(bucketName)
        .upload(allfiles)
        .then(() => {
            console.log(`${allfiles} uploaded to ${bucketName}.`);
        })
        .catch(err => {
            console.error('ERROR:', err);
        });
});
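Note that bucket.upload() in @google-cloud/storage expects a single local file path per call, so passing the whole files array will still fail; each element of the array needs to be uploaded individually (see the sketch after the final solution below).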

So I finally got the script to work. The problem was that the file paths were being captured as an array (a JavaScript object), but the Google Cloud Storage upload() method needs each path as a string.

The paths therefore had to be converted to a string using JSON.stringify, and the resulting string split so that only the file path content was selected.

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');
const fs = require('fs');

var bucketName = 'intended GCS bucketname';

const storage = new Storage({
    projectId: 'full project Id',
    keyFilename: 'cloud service key'
});

// Invoke glob for reading file paths in the local/VM directory
var glob = require('glob');

glob('path to local/vm directory', function (err, files) {
    if (err) {
        console.log(err);
    }

    var list = JSON.stringify(files);
    list = list.split('"');
    var listone = list[1];

    storage
        .bucket(bucketName)
        .upload(listone)
        .then(() => {
            console.log(`${listone} uploaded to ${bucketName}.`);
        })
        .catch(err => {
            console.error('ERROR:', err);
        });
});
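Note that splitting the JSON string on '"' and taking list[1] only extracts the first matched path, so only one file is uploaded. Since glob already delivers an array of plain path strings, the JSON.stringify round-trip is not needed: you can index into the array or loop over it directly. A minimal sketch, assuming the same bucketName and storage setup as above and a hypothetical 'folder/*.js' pattern:

// A minimal sketch: upload every file matched by glob.
// Each element of `files` is already a plain path string,
// so no JSON.stringify round-trip is needed.
glob('folder/*.js', function (err, files) {
    if (err) {
        console.error(err);
        return;
    }

    // Start one upload per matched file and wait for all of them
    Promise.all(
        files.map(file => storage.bucket(bucketName).upload(file))
    )
        .then(() => {
            console.log(`${files.length} file(s) uploaded to ${bucketName}.`);
        })
        .catch(err => {
            console.error('ERROR:', err);
        });
});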
