Been looking around the net for an answer to this, but not found anything conclusive.
I have a node application that (potentially) needs to make a large number of HTTP GET requests.
Let's say http://foo.com/bar accepts an 'id' query parameter, and I have a large number of IDs to process (~1k), e.g.
http://foo.com/bar?id=100
http://foo.com/bar?id=101
etc.
What libraries have folks used that might be best suited to this task?
I guess I'm looking for something between a queue and a connection pool:
The setup: a queue of work items (the IDs to be requested).
The process: a defined number X of 'workers' pull items off the queue, so there are at most X concurrent workers running at a time.
Any experience welcome.
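For context, that queue/worker pattern can be sketched in plain promises with no library at all. This is only an illustration, not the answer below: `processOne` is a hypothetical stand-in for the real HTTP call, and error handling is left out for brevity (a single rejection fails the whole batch).

```javascript
// A minimal sketch of the queue/worker pattern: 'workerCount' workers
// each repeatedly pull the next item off a shared queue until it drains.
function runPool(items, workerCount, processOne) {
  var queue = items.slice(); // shared queue of pending IDs
  var results = [];

  function worker() {
    var item = queue.shift();
    if (item === undefined) return Promise.resolve(); // queue drained
    return Promise.resolve(processOne(item))
      .then(function (result) { results.push(result); })
      .then(worker); // this worker pulls the next item
  }

  var workers = [];
  for (var i = 0; i < workerCount; i++) workers.push(worker());
  return Promise.all(workers).then(function () { return results; });
}
```

Results arrive in completion order rather than input order, which is usually acceptable when each response carries its own ID.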
It was actually a lot simpler than I initially thought, and only requires Bluebird (I'm paraphrasing here a little bit since my final code ended up much more complex):
var Promise = require('bluebird');
...
var allResults = [];
...
Promise.map(idList, (id) => {
  // For each ID in idList, make an HTTP call
  return http.get( ... url: 'http://foo.com/bar?id=' + id ... )
    .then((httpResponse) => {
      return allResults.push(httpResponse);
    })
    .catch((err) => {
      var errMsg = 'ERROR: [' + err + ']';
      console.log(errMsg + (err.stack ? '\n' + err.stack : ''));
    });
}, { concurrency: 10 }) // Max of 10 concurrent HTTP calls at once
.then(() => {
  // All requests are now complete, return all results
  return res.json(allResults);
});