
Send Multiple HTTP requests

I need to write a program that looks up information about items using the item ID.

The API only takes one item at a time, so I have to perform one query per item. The API is also limited to five simultaneous requests; any requests beyond that limit receive an HTTP 429 error.

I have a JavaScript array of objects that holds all the items with their IDs.

How do I retrieve the information for all given IDs without triggering the simultaneous-request limit, and without performing unnecessary queries for item IDs that have already been seen?

import fetch from "node-fetch";

let itemObject = [
  { itemName: "", itemID: "" },
  { itemName: "", itemID: "" },
  { itemName: "", itemID: "" },
  { itemName: "", itemID: "" },
];

async function sendRequests() {
  try {
    const response = await fetch("https://url/items/:ID", {
      headers: {
        Authorization: "",
      },
    });
    if (!response.ok) {
      throw new Error(`${response.status} ${response.statusText}`);
    }
    response
      .text()
      .then((res) => console.log(res))
      .catch((err) => {
        throw new Error(err);
      });
  } catch (error) {
    console.error(error);
  }
}

sendRequests()

There are two approaches that come to mind for this: batch processing and a sliding window. Batch processing is easier to implement, but a sliding window is the more efficient approach.

Batch processing with Promise.all()

This approach creates batches of requests up to a specified batchSize, and only after all requests in a batch are done is the next batch issued.

You need to add some error handling in case of failed requests here.

import fetch from "node-fetch";

// list of items that you might want to use to compose your URL (not actually used here)
let itemObject = [
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
];


(async () => {
    // number of concurrent requests in one batch
    const batchSize = 4;
    // request counter
    let curReq = 0;
    // as long as there are items left in the list, continue to form batches
    while (curReq < itemObject.length) {
        // a batch is either limited by the batch size or smaller when fewer items remain
        const end = Math.min(itemObject.length, curReq + batchSize);
        const batchStart = curReq;
        // collect the promises of the current batch
        const concurrentReq = [];
        // issue one request for each item in the batch
        for (let index = curReq; index < end; index++) {
            concurrentReq.push(fetch("https://postman-echo.com/get"));
            console.log(`sending request ${curReq}...`);
            curReq++;
        }
        // wait until all promises are fulfilled or one promise is rejected
        await Promise.all(concurrentReq);
        console.log(`requests ${batchStart}-${curReq} done.`);
    }
})();

Expected result:

sending request 0...
sending request 1...
sending request 2...
sending request 3...
requests 0-4 done.
sending request 4...
sending request 5...
sending request 6...
sending request 7...
requests 4-8 done.
sending request 8...
sending request 9...
sending request 10...
sending request 11...
requests 8-12 done.
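One way to add the error handling mentioned above is to swap Promise.all for Promise.allSettled, which waits for every request in the batch and reports failures individually instead of rejecting the whole batch. A minimal sketch with made-up names (runBatches, simulateRequest); the simulated request stands in for the real fetch call:

```javascript
// runBatches collects successful results and logs failed requests
async function runBatches(ids, batchSize, request) {
    const results = [];
    for (let start = 0; start < ids.length; start += batchSize) {
        const batch = ids.slice(start, start + batchSize).map((id) => request(id));
        // allSettled waits for every promise and reports failures
        // individually instead of rejecting the whole batch
        const settled = await Promise.allSettled(batch);
        for (const res of settled) {
            if (res.status === "fulfilled") results.push(res.value);
            else console.log(res.reason.message);
        }
    }
    return results;
}

// usage with a simulated request that fails for every fifth ID
const simulateRequest = (id) =>
    id % 5 === 0
        ? Promise.reject(new Error(`request ${id} failed`))
        : Promise.resolve({ id });

runBatches([...Array(12).keys()], 4, simulateRequest).then((results) =>
    console.log(`${results.length} of 12 requests succeeded`) // 9 of 12
);
```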

Sliding window with semaphore

This approach uses a sliding window: it schedules a new request as soon as another request is done, while always keeping the number of in-flight requests at or below a maximum of n concurrent requests at any one time. What you need to implement this is a semaphore.

There is a library for this in JavaScript called async-mutex.

Here is a sample program using this library to send 2 requests concurrently to the Postman Echo API. There will never be more requests running concurrently than the semaphore allows (in your case that limit would be 5; here it is 2).

import { Semaphore } from "async-mutex";
import fetch from "node-fetch";

// list of items that you might want to use to compose your URL (not actually used here)
let itemObject = [
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
    { itemName: "", itemID: "" },
];

(async () => {
    // allow two concurrent requests (adjust for however many are required)
    const semaphore = new Semaphore(2);

    itemObject.forEach(async (item, idx) => {
        // acquire the semaphore
        const [, release] = await semaphore.acquire();
        // at this point the semaphore has been acquired and the job needs to be done
        try {
            console.log(`sending request ${idx}...`)
            const response = await fetch("https://postman-echo.com/get")
            if(!response.ok){
                console.log(`request failed with status code ${response.status}`)
            }
        }
        catch (error) {
            console.log("request failed.")
        }
        finally {
            console.log(`request ${idx} done...`)
            // release the semaphore again so a new request can be issued 
            release();
        }
    })
})();

Expected output (order may vary):

sending request 0...
sending request 1...
request 1 done...
sending request 2...
request 2 done...
sending request 3...
request 3 done...
sending request 4...
request 0 done...
sending request 5...
request 4 done...
sending request 6...
request 5 done...
sending request 7...
request 6 done...
sending request 8...
request 7 done...
sending request 9...
request 8 done...
sending request 10...
request 9 done...
sending request 11...
request 10 done...
request 11 done...
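The question also asks to avoid unnecessary queries for item IDs that have already been seen. A small promise cache in front of the request function covers that; a sketch with made-up names (memoizeById, lookup), where the async function stands in for the real fetch call:

```javascript
// memoizeById wraps a request function so each ID is only requested once
function memoizeById(request) {
    const cache = new Map();
    return (id) => {
        if (!cache.has(id)) {
            // cache the promise itself, so concurrent calls for the same ID
            // share one in-flight request instead of issuing a duplicate
            cache.set(id, request(id));
        }
        return cache.get(id);
    };
}

// usage with a simulated request that counts how often it is actually called
let calls = 0;
const lookup = memoizeById(async (id) => {
    calls++;
    return { id };
});

Promise.all([lookup(1), lookup(2), lookup(1)]).then(() =>
    console.log(`3 lookups, ${calls} actual requests`) // 2 actual requests
);
```

Caching the promise rather than the resolved value means even requests issued in the same tick are deduplicated.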

Wait for individual API calls to complete

Try await sendRequests() - the pending promise returned by sendRequests() is being discarded, as it is not passed to an await operator and has no then, catch or finally handlers added to it.

If you want await sendRequests() to be fulfilled only after the handlers of the promise chain started by response.text() have been executed, rather than merely after they have been defined (which occurs synchronously inside sendRequests), add a return statement before response.text():

 return response.text()
 .then //  ... rest of promise chain code

This forces await sendRequests() to wait for the promise chain processing to be performed.
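The effect of that return can be demonstrated with a small self-contained sketch, where delay stands in for an asynchronous step such as response.text() (all names here are made up for illustration):

```javascript
// delay() resolves after the given number of milliseconds
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function withoutReturn(state) {
    // the chain is started, but its promise is discarded
    delay(20).then(() => { state.finished = true; });
}

async function withReturn(state) {
    // returning the chain ties it to the function's own promise
    return delay(20).then(() => { state.finished = true; });
}

(async () => {
    const a = { finished: false };
    await withoutReturn(a);
    console.log(`without return, chain finished: ${a.finished}`); // false

    const b = { finished: false };
    await withReturn(b);
    console.log(`with return, chain finished: ${b.finished}`); // true
})();
```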

Counting outstanding requests

Try renaming sendRequests to sendRequest (singular), and writing a node module (perhaps called sendRequests) that keeps count of requests that have been issued but are still waiting for a response. It would return a promise for each individual request, but not issue a new fetch operation until the count of outstanding requests is below the allowed limit.

The complexity of such a module depends on design criteria:

  • Is it used by a single node server, for a single API, for a single account?
  • Does it have to support multiple API URLs, multiple accounts and multiple callers?

The generic solution of using a modular factory function or class constructor to create a tailored sendRequests function may or may not be overkill for your use case.
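A minimal sketch of such a factory, assuming all names here are made up for illustration (createSendRequest, doFetch) and doFetch stands in for the actual fetch call:

```javascript
// createSendRequest returns a limited sendRequest function that counts
// outstanding requests and queues callers once the limit is reached
function createSendRequest(limit, doFetch) {
    let outstanding = 0; // requests issued but not yet settled
    const waiting = [];  // resolvers of callers queued for a free slot

    const wakeNext = () => {
        if (outstanding < limit && waiting.length > 0) {
            waiting.shift()(); // wake the oldest queued caller
        }
    };

    return async function sendRequest(id) {
        // wait (and re-check) until a slot below the limit is free
        while (outstanding >= limit) {
            await new Promise((resolve) => waiting.push(resolve));
        }
        outstanding++;
        try {
            return await doFetch(id);
        } finally {
            outstanding--;
            wakeNext(); // a slot freed up, let the next waiter try
        }
    };
}

// usage: wrap a simulated request with a limit of 2 concurrent calls
const sendRequest = createSendRequest(
    2,
    (id) => new Promise((resolve) => setTimeout(() => resolve(id * 2), 10))
);

Promise.all([1, 2, 3, 4].map(sendRequest)).then((results) =>
    console.log(results) // [ 2, 4, 6, 8 ]
);
```

The while loop around the wait is deliberate: a woken caller re-checks the count, so a new caller slipping in between the wake-up and the resumption cannot push the concurrency above the limit.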
