
How to use Promise.all to handle thousands of requests

In my React app I have a component that sends requests to an online service that can handle 50 requests max. I now have a new requirement to look up 7000 MAC addresses.

function App() {
  const [data, setData] = useState([]);

  useEffect(() => {
    const fetchData = async () => {
      await axios.all([
        axios.get("/ipdn/<MAC ADDRESS>", { timeout: 10000 }),
        axios.get("/ipdn/<MAC ADDRESS>", { timeout: 10000 })
        // Adding all the MAC addresses .......
      ]).then((responseArr) => {
        setData(responseArr);
      });
    };
    fetchData();
  }, []);
}

I would like to extend the fetchData function so that it sends only 50 requests at a time and waits until that batch is complete.

When a batch is complete, the next 50 will be executed.

Thank you

Here is how you can do it without any external libraries:

const ips = [
  /* List of MAC addresses. */
];

useEffect(() => {
  const fetchData = async () => {
    let loadedData = [];

    // Iterate over slices of the array until all the ips have been processed.
    for (const sliceIps of sliceGenerator(ips)) {
      const gettingData = sliceIps.map(getDataFromIp);
      const sliceLoadedData = await axios.all(gettingData);
      loadedData = loadedData.concat(sliceLoadedData);
    }
    setData(loadedData);
  };
  fetchData();
}, []);

const getDataFromIp = (ip) =>
  axios.get("/ipdn/<MAC ADDRESS>", { timeout: 10000 });

// Generates a slice of an array, here the slice has a size of 50 max.
function* sliceGenerator(arr) {
  const sliceSize = 50;
  let i = 0;
  while (i < arr.length) {
    yield arr.slice(i, i + sliceSize);
    i += sliceSize;
  }
}

I'm using a generator function, sliceGenerator, here to generate the slices of the ips array. This way you batch-process them 50 at a time.

I'm also using a for (... of ...) loop. It's very convenient because you can use the await keyword inside.
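
For illustration, here is a minimal sketch (outside React, with a hypothetical list of 120 addresses) of how sliceGenerator yields the batches:

const sample = Array.from({ length: 120 }, (_, i) => `mac-${i}`);

// With a 120-element array, sliceGenerator yields batches of 50, 50 and 20.
for (const batch of sliceGenerator(sample)) {
  console.log(batch.length); // 50, 50, 20
}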

Without a library, you could use this function:

function poolPromises(iterPromises, poolSize) {
    return new Promise((resolve, reject) => {
        let promises = [];
        function nextPromise() {
            let { value, done } = iterPromises.next();
            if (done) {
                // The iterator is exhausted: resolve with all collected results.
                resolve(Promise.all(promises));
            } else {
                promises.push(value); // value is a promise
                // When this promise resolves, pull the next one from the iterator.
                value.then(nextPromise, reject);
            }
            return !done;
        }

        // Fill the pool with the first poolSize promises.
        while (promises.length < poolSize && nextPromise()) { }
    });
}

This function takes promises from an iterator, up to the pool size. Whenever a promise resolves, it pulls the next promise from the iterator so the pool is full again. The pool therefore never has to drain completely before the next chunk of promises is created: as soon as a spot is free, it is used again.

It is important that the iterator only creates a next promise when one is pulled from it via the next() method.
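
For illustration, here is a minimal sketch of the difference (assuming axios and a macAddresses array are in scope). The eager version fires every request as soon as the array is built, so the pool cannot limit concurrency; the generator only sends a request when next() pulls it:

// Eager: all requests start immediately when this array is created.
const eager = macAddresses.map((mac) => axios.get("/ipdn/" + mac, { timeout: 10000 }));

// Lazy: nothing is sent yet; each axios.get only runs when the pool
// pulls the next value from the iterator via next().
function* lazy() {
    for (const mac of macAddresses) {
        yield axios.get("/ipdn/" + mac, { timeout: 10000 });
    }
}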

In your use case, you can call it as follows:

const fetchData = async () => {
    function * iterRequests() {
        for (let macAddress of macAddresses) {
            yield axios.get("/ipdn/" + macAddress, { timeout: 10000 });
        }
    }
    return poolPromises(iterRequests(), 50).then(setData);
}    

Note: fetchData does not have to be declared async, since there is no await in it.
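
One caveat worth noting (my addition, not part of the original answer): poolPromises rejects as soon as any single request fails, because each rejection is forwarded to reject and Promise.all fails fast. If you would rather collect failures and keep the pool running, a minimal sketch is to catch inside the generator so every yielded promise resolves:

function * iterRequests() {
    for (let macAddress of macAddresses) {
        // A failed request resolves to an error object instead of rejecting the pool.
        yield axios.get("/ipdn/" + macAddress, { timeout: 10000 })
            .catch((error) => ({ macAddress, error }));
    }
}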

If you don't mind using an external library, you can make use of es6-promise-pool to manage the concurrent requests as shown below.

import PromisePool from 'es6-promise-pool';


// macs - Array of mac addresses
useEffect(() => {
  const fetchData = () => {
    const results = [];
    const generatePromises = function*() {
      for (let count = 0; count < macs.length; count++) {
        yield axios.get(`/ipdn/${macs[count]}`, { timeout: 10000 });
      }
    }
    const promiseIterator = generatePromises();
    // Create a pool with 10 concurrent requests max
    const pool = new PromisePool(
      promiseIterator,
      10 // Configurable
    );
    // To listen to result
    pool.addEventListener('fulfilled', function (event) {
      console.log('Fulfilled: ' + event.data.result);
      results.push(event.data.result);
    });
    // Start the pool
    pool.start().then(function () {
      setData(results);
      console.log('Complete');
    });
  };
  fetchData();
}, []);
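
If you also want to capture failures, the library exposes a rejected event that can be listened to in the same way as fulfilled (a sketch based on es6-promise-pool's documented event API; register it next to the fulfilled listener above):

// Collect errors so a single failed request does not go unnoticed.
const errors = [];
pool.addEventListener('rejected', function (event) {
  console.log('Rejected: ' + event.data.error.message);
  errors.push(event.data.error);
});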

I am not familiar with axios.all, so here is a way that just uses Promise.all. The idea is to split the input array into blocks of 50 addresses, then resolve them one block at a time.

function App() {
  const [data, setData] = useState([]);

  useEffect(() => {
    // helper function, split array to chunked array
    const splitToChunks = (items, chunkSize = 50) => {
      const result = [];
      for (let i = 0; i < items.length; i += chunkSize) {
        result.push(items.slice(i, i + chunkSize));
      }
      return result;
    }
    
    const fetchData = async () => {
      const result = []; // init value
      const macAddresses = []; // array of mac addresses - your mac addresses
    
      const chunkedArray = splitToChunks(macAddresses); // return array of array [[...50 mac adds], [], []]
    
      for (const macs of chunkedArray) { // now macs is [...50 mac adds]
        const promises = macs.map((mac) => {
          return axios.get(`/ipdn/${mac}`, { timeout: 10000 });
        });
        // now promises is array contains 50 Promises
        const response = await Promise.all(promises); // wait until finish 50 requests
        result.push(...response); // copy response to result, and continue for next block
      }
    
      setData(result);
    };
    fetchData();
  }, []);
}
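
One thing to be aware of (my note, not part of the original answer): Promise.all rejects the whole block if a single request fails. If partial results are acceptable, a minimal variation is to use Promise.allSettled inside the loop and keep only the fulfilled responses:

for (const macs of chunkedArray) {
  const promises = macs.map((mac) => axios.get(`/ipdn/${mac}`, { timeout: 10000 }));
  // allSettled never rejects; each entry is { status, value } or { status, reason }.
  const settled = await Promise.allSettled(promises);
  result.push(...settled.filter((r) => r.status === 'fulfilled').map((r) => r.value));
}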
