
Azure Function (blob trigger): read CSV file with Node.js

What is the best way in Node.js to read the myBlob buffer parameter in an Azure Function and write it into Table Storage? I'm trying with a 16 MB file and it's too slow.

Another thing: when I use context.log() to see the content of the file, it doesn't show all of the content, only part of it.

module.exports = async function (context, myBlob) {

     let data = myBlob.toString("utf8");

     context.log("context", data);
};

You can use getBlobToStream or getBlobToText from the azure-storage package in Node.js to get the blob data. Below is the function.json needed for the blob trigger:

{
  "bindings": [
    {
      "name": "Blob",
      "type": "blobTrigger",
      "dataType": "binary",
      "direction": "in",
      "path": "folderpath",
      "connection": "CONN_STR"
    }
  ]
}

Regarding the code you've provided, I made a few changes to it as below:

const azure = require('azure-storage');

// Create the service client from the same connection string
// the trigger binding uses
const blobService = azure.createBlobService(process.env.CONN_STR);
const containerName = 'ContainerName';
const blobName = 'BlobName';

blobService.getBlobToText(
    containerName,
    blobName,
    function (err, blobContent, blob) {
        if (err) {
            console.error("Couldn't download blob %s", blobName);
            console.error(err);
        } else {
            console.log("Successfully downloaded blob %s", blobName);
            console.log(blobContent);
        }
    });
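For the Table Storage side of the question: entities must be inserted in batches of at most 100 operations per partition, so the parsed CSV has to be chunked before writing. Here is a hypothetical helper (the function name and batch size handling are mine, and the parsing is simplified: no quoted fields or embedded commas) that turns the blob buffer into row objects grouped into batches:

```javascript
// Parse a CSV buffer into row objects and group them into batches
// of 100, the maximum size of a Table Storage batch operation.
function toBatches(csvBuffer, batchSize = 100) {
  const lines = csvBuffer.toString('utf8').trim().split('\n');
  const headers = lines.shift().split(',');
  const rows = lines.map(line => {
    const values = line.split(',');
    return Object.fromEntries(headers.map((h, i) => [h.trim(), values[i]]));
  });
  const batches = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}

// Each batch could then be sent with tableService.executeBatch(...)
// from azure-storage, rather than one insert per row.
```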

Alternatively, read the blob content directly from the trigger binding:

module.exports = async function (context, myBlob) {
    context.log(context.bindings.myBlob.toString());
    context.log("Blob:", context.bindingData.blobTrigger, "\n Blob Size:", myBlob.length, "Bytes");
};
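Once rows are parsed, each one needs a PartitionKey and RowKey before it can go into a Table Storage batch. A sketch of that mapping, mimicking the `{ _: value }` wrapping that azure-storage's entityGenerator produces (the key choices here are assumptions for illustration):

```javascript
// Hypothetical mapping from a parsed CSV row to a Table Storage
// entity. PartitionKey/RowKey choices are placeholders; pick keys
// that match your query pattern.
function toEntity(row, index) {
  const entity = {
    PartitionKey: { _: 'csv-import' },
    RowKey: { _: String(index) },
  };
  for (const [key, value] of Object.entries(row)) {
    entity[key] = { _: value };
  }
  return entity;
}
```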

Also check this SO question for more insights.
