
Node.js: Split large array and make multiple API calls

I have a CSV file that contains 21k records (one alphanumeric word per line). I need to read these records and send them to an API, in JSON key-value pair format, for some processing; the API accepts only 500 elements at a time. I have a solution in mind, but I wanted to know whether there is a better or more efficient solution/algorithm for this.

Algorithm:

  1. Load the CSV into an array.
  2. Split this 1D array into N arrays with a fixed length of 500 elements each.
  3. For each of the N 500-element arrays, prepare a JSON payload and send it to the API.

Code:

const fs = require('fs');

fs.readFile(inputPath, 'utf8', function (err, data) {
  if (err) throw err;
  const dataArray = data.split(/\r?\n/);

  // walk the array in windows of at most 500 elements
  for (let i = 0; i < dataArray.length; ) {
    const temp = [];
    for (let j = 0; j < 500 && i < dataArray.length; j++) {
      temp.push(dataArray[i]);
      i++;
    }
    // make an API call with the current batch
    makeCallToAPI(temp);
  }
});

I'd use lodash's (or underscore's) _.chunk(). Also note that both the fs read and the API calls are better handled asynchronously.
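
For reference, _.chunk(array, size) splits an array into groups of at most size elements, with any remainder going into the final group:

const _ = require('lodash');

_.chunk(['a', 'b', 'c', 'd', 'e'], 2);
// => [['a', 'b'], ['c', 'd'], ['e']]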

const fs = require('fs');
const _ = require('lodash');

async function callApi(chunk) {
  // return a promise that resolves with the result of the API call
}

function readFS(inputPath) {
  // wrap callback-style fs.readFile in a promise
  return new Promise((resolve, reject) => {
    fs.readFile(inputPath, 'utf8', function (err, data) {
      if (err) reject(err);
      else resolve(data.split(/\r?\n/));
    });
  });
}

async function doTheWork(inputPath) {
  const data = await readFS(inputPath);
  const chunks = _.chunk(data, 500);    // arrays of at most 500 lines each
  const promises = chunks.map(callApi); // start one API call per chunk
  return _.flatten(await Promise.all(promises));
}

Also note the use of _.flatten(): Promise.all() resolves to an array of per-chunk results (an array of arrays), which _.flatten() collapses back into a single flat array.
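
A minimal usage sketch (the file path and the logged message are illustrative, not part of the original answer):

doTheWork('./words.csv')
  .then(results => console.log(`Processed ${results.length} records`))
  .catch(err => console.error(err));

One design note: Promise.all() starts all ~42 requests (21000 / 500) at once. If the API rate-limits concurrent calls, awaiting each chunk in a for...of loop inside doTheWork is a simpler, if slower, alternative.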
