
Implementing producers/consumers with bacon.js event stream pool

I want to implement the producer/consumer pattern with a pool of bacon.js event streams. Here's the specific problem I'm trying to solve:

I have a list of 'n' URLs. I want to create event streams that make HTTP requests for those URLs, but I want to limit them to 'x' streams ('x' network requests) at a time. In the event handler for those streams, I create a new event stream that writes the HTTP response to a file. But I want to limit the number of streams writing to file to 'y' at a time.

In Gevent/Java, I'd create thread pools of the appropriate sizes and draw threads from the appropriate pool. How do I do something similar with event streams?

Using flatMapWithConcurrencyLimit you can control the number of spawned streams:

function fetchUsingHttp(url) { .. }  // <- returns EventStream of http result
function writeToFile(data) { .. }    // <- returns EventStream of file write result
var urls;                    // <- EventStream of urls
var maxRequests, maxWrites;  // <- maximum concurrency limits
var httpResults = urls.flatMapWithConcurrencyLimit(maxRequests, fetchUsingHttp);
var fileWriteResults = httpResults.flatMapWithConcurrencyLimit(maxWrites, writeToFile);
fileWriteResults.log();
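flatMapWithConcurrencyLimit keeps at most the given number of spawned streams active at once and queues the rest. The core idea can be sketched with plain Promises, independent of Bacon.js; this is an illustrative sketch, and the names runWithLimit and task are hypothetical, not part of the Bacon.js API:

```javascript
// Run `task` over `items` with at most `limit` tasks in flight at once.
// `task` takes an item and returns a Promise; results keep input order.
function runWithLimit(limit, items, task) {
  const results = [];
  let next = 0;    // index of the next item to start
  let active = 0;  // number of tasks currently in flight
  return new Promise((resolve, reject) => {
    function launch() {
      // Start tasks until the concurrency limit or the end of the list.
      while (active < limit && next < items.length) {
        const i = next++;
        active++;
        task(items[i]).then((value) => {
          results[i] = value;
          active--;
          if (next === items.length && active === 0) resolve(results);
          else launch();  // a slot freed up: start the next queued task
        }, reject);
      }
      if (items.length === 0) resolve(results);
    }
    launch();
  });
}
```

Chaining two such stages, one bounded by maxRequests and one by maxWrites, mirrors the two-pool setup described above, just as the two chained flatMapWithConcurrencyLimit calls do.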

