
Implementing producers/consumers with bacon.js event stream pool

I want to implement the producer-consumer pattern with a pool of bacon.js event streams. Here's the specific problem I'm trying to solve:

I have a list of 'n' URLs. I want to create event streams that make HTTP requests for those URLs, but I want to limit this to 'x' streams ('x' network requests) at a time. In the event handler for each of those streams, I create a new event stream that writes the HTTP response to a file, but I want to limit the number of streams writing to files to 'y' at a time.

In Gevent or Java, I'd create thread pools of the appropriate sizes and take threads from the appropriate pool. How do I do something similar with event streams?

Using flatMapWithConcurrencyLimit, you'll be able to control the number of spawned streams:

function fetchUsingHttp(url) { /* ... */ }   // <- returns an EventStream of the HTTP result
function writeToFile(data) { /* ... */ }     // <- returns an EventStream of the file write result

var urls;                    // <- EventStream of URLs
var maxRequests, maxWrites;  // <- maximum concurrency limits

var httpResults = urls.flatMapWithConcurrencyLimit(maxRequests, fetchUsingHttp);
var fileWriteResults = httpResults.flatMapWithConcurrencyLimit(maxWrites, writeToFile);
fileWriteResults.log();
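
For completeness, here's a minimal sketch of what the two helpers could look like in Node.js. Everything beyond flatMapWithConcurrencyLimit itself is my assumption: a promise-returning fetch (e.g. the global fetch in recent Node versions) wrapped with Bacon.fromPromise, fs.writeFile wrapped with Bacon.fromNodeCallback, Bacon.fromArray for the URL list, and an encodeURIComponent-based file-naming scheme chosen purely for illustration.

var Bacon = require("baconjs").Bacon;
var fs = require("fs");

// Wrap a promise-based HTTP request in an EventStream that emits
// { url, body } once and then ends. Assumes a global fetch is available.
function fetchUsingHttp(url) {
  return Bacon.fromPromise(
    fetch(url)
      .then(function (res) { return res.text(); })
      .then(function (body) { return { url: url, body: body }; })
  );
}

// Wrap the Node-style fs.writeFile callback in an EventStream.
// Deriving the file name from the URL is a hypothetical choice.
function writeToFile(result) {
  var fileName = encodeURIComponent(result.url) + ".html";
  return Bacon.fromNodeCallback(fs.writeFile, fileName, result.body)
    .map(function () { return fileName; });
}

var urls = Bacon.fromArray(["http://example.com/a", "http://example.com/b"]);
var maxRequests = 3, maxWrites = 2;

var httpResults = urls.flatMapWithConcurrencyLimit(maxRequests, fetchUsingHttp);
var fileWriteResults = httpResults.flatMapWithConcurrencyLimit(maxWrites, writeToFile);
fileWriteResults.log();

Because each wrapper returns a stream that emits a single value and then ends, flatMapWithConcurrencyLimit can tell when a slot frees up and start the next request or write, which is what gives you the pool-like behaviour of at most 'x' requests and 'y' writes in flight.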
