
Node.js - Await all the promises fired off inside a loop

I'm dealing with a loop in Node.js that performs two tasks on every iteration. To simplify, the code can be summarized as:

  1. Extract product metadata from a web page (blocking task).
  2. Save all the product metadata to a database (asynchronous task).

The save operation (2) performs about 800 database operations, and it doesn't need to block the main thread (I can keep extracting product metadata from the web pages in the meantime).

So, that being said, awaiting for the products to be saved doesn't make any sense. But if I fire off the promises without awaiting them, on the last iteration of the loop the Node.js process exits and all the pending operations are left unfinished.

What is the best approach to solve this? Is it possible to achieve it without keeping a counter of finished promises or using emitters? Thanks.

for (let shop of shops) {
  // 1
  const products = await extractProductsMetadata(shop);

  // 2
  await saveProductsMetadata(products);
}

Collect the promises in an array, then use Promise.all on it:

const storePromises = [];

for (let shop of shops) {
  const products = await extractProductsMetadata(shop); //(1)
  storePromises.push(saveProductsMetadata(products)); //(2)
}

await Promise.all(storePromises);
// ... all done (3)

With that, the (1) calls run one after another, the (2) calls run in parallel, and (3) runs once all of them have finished.

Of course, you can also run (1) and (2) in parallel:

await Promise.all(shops.map(async shop => {
  const products = await extractProductsMetadata(shop); //(1)
  await saveProductsMetadata(products); //(2)
}));

And if an error occurs in one of the promises, you can handle it with a try / catch block, to make sure all the other shops won't be affected:

await Promise.all(shops.map(async shop => {
  try {
    const products = await extractProductsMetadata(shop); //(1)
    await saveProductsMetadata(products); //(2)
  } catch (error) {
    // handle it here
  }
}));
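
As an alternative to wrapping each item in try / catch, Promise.allSettled (available since Node.js 12.9) waits for every promise and reports each outcome instead of rejecting on the first failure. A minimal sketch (the index-based logging is just a placeholder):

const results = await Promise.allSettled(shops.map(async shop => {
  const products = await extractProductsMetadata(shop);
  await saveProductsMetadata(products);
}));

// Report failures without letting them affect the other shops
results.forEach((result, i) => {
  if (result.status === 'rejected') {
    console.error(`shop at index ${i} failed:`, result.reason);
  }
});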

How do I signal Node to finish the process?

You could manually call process.exit(0);, but that hides the real problem: Node.js exits automatically once there is nothing left keeping the event loop alive (open handles, listeners, timers). That means you should close all database connections / servers / etc. after the code above is done.
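
For example, a minimal sketch — connectToDatabase and db.close() are hypothetical stand-ins for whatever your driver provides (e.g. client.close() in the MongoDB driver or pool.end() in pg):

const db = await connectToDatabase(); // hypothetical connection helper

try {
  const storePromises = [];
  for (let shop of shops) {
    const products = await extractProductsMetadata(shop);
    storePromises.push(saveProductsMetadata(products));
  }
  await Promise.all(storePromises);
} finally {
  // Closing the connection removes the last open handle,
  // so the Node.js process can exit on its own
  await db.close();
}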

We create packs of data to process. When we process a pack, we do all the extraction sequentially and all the saving asynchronously.

I have not handled the failure part; I'll let you add it. An appropriate try / catch or function encapsulation will do it (see the sketch after the code below).

/**
 * Call the given promise-returning functions one after another (a queue)
 * options = { context, args }
 */
function promiseQueue(promisesFuncs, options = {}, _i = 0, _ret = []) {
  return new Promise((resolve, reject) => {
    if (_i >= promisesFuncs.length) {
      return resolve(_ret);
    }

    // Call one function, then recurse into the next one
    (promisesFuncs[_i]).apply(options.context || this, options.args || [])
      .then(ret => promiseQueue(promisesFuncs, options, _i + 1, [
        ..._ret,
        ret,
      ]))
      .then(resolve)
      .catch(reject);
  });
}

async function executePromisesAsPacks(arr, packSize, _i = 0) {
  const toExecute = arr.slice(_i * packSize, (_i + 1) * packSize);

  // Leave if we did execute all packs
  if (toExecute.length === 0) return true;

  // First we get all the data sequentially
  const products = await promiseQueue(toExecute.map(x => () => extractProductsMetadata(x)));

  // Then save the products asynchronously
  // We do not await here so it's truly asynchronous
  Promise.all(toExecute.map((x, xi) => saveProductsMetadata(products[xi])));

  // Call the next pack
  return executePromisesAsPacks(arr, packSize, _i + 1);
}

// Make packs of data to process (we extract sequentially and save asynchronously)
// Made to handle huge datasets
await executePromisesAsPacks(shops, 50);
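
For the missing failure handling, one option is a sketch like the following: attach a catch to the fire-and-forget Promise.all so a rejected save doesn't become an unhandled rejection (which terminates the process on recent Node.js versions). The logging is just a placeholder:

// Inside executePromisesAsPacks, instead of the bare Promise.all:
Promise.all(toExecute.map((x, xi) => saveProductsMetadata(products[xi])))
  .catch(error => {
    // Placeholder: log and keep going; retry or re-queue logic could go here
    console.error('saving a pack failed:', error);
  });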
