Does it make sense to use async.reduce
if all the iterator function does is make an HTTP request to the same host (plus scraping in the request's callback)? I'm asking because the maximum number of concurrent TCP connections to the same host appears to be one: judging by console.log
output in my code, a new HTTP request is only made after the scraping in the previous iteration finishes, regardless of how long the web server takes to respond.
Solved it myself: I didn't know that async.reduce
processes its iterations in series, not in parallel.
Source: https://github.com/caolan/async#reducearr-memo-iterator-callback
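To illustrate the serial behavior described above, here is a minimal sketch of a series-style reduce using plain promises instead of the async library (so `reduceSeries` and `fakeFetch` are hypothetical stand-ins, not part of async's API). Each "request" starts only after the previous one has finished, which matches the observed one-connection-at-a-time behavior:

```javascript
// Minimal sketch of a serial async reduce, mirroring async.reduce's
// documented behavior: each async step starts only after the previous
// one's callback has fired.
async function reduceSeries(items, memo, iteratee) {
  let acc = memo;
  for (const item of items) {
    // Awaiting here is what serializes the iterations.
    acc = await iteratee(acc, item);
  }
  return acc;
}

// Simulated HTTP request + scrape: resolves after a short delay.
function fakeFetch(acc, url) {
  return new Promise((resolve) => {
    setTimeout(() => resolve(acc.concat(url)), 10);
  });
}

reduceSeries(["a", "b", "c"], [], fakeFetch).then((result) => {
  // The items are collected strictly in input order.
  console.log(result);
});
```

If parallel requests are actually wanted, async's `map` (or `mapLimit` to cap concurrency) is the usual choice instead of `reduce`.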