
Event Emitters vs Promises for parallel execution?

I'm making a slew of modules, each of which will need to run code during various phases. I could use a Promise.all for each phase like:

const phase1promise = Promise.all([module1.phase1(), module2.phase1()]);
phase1promise.then(() => { /* do next phases */ });

Or, I could use an event emitter from a "master" module that submodules listen to in order to know when to run code for each phase. In turn, the master module would listen to those submodules' event emitters to know when they're done with that phase. I rigged up this event emitter system and it's working, but I'm starting to think that promises might be better, especially with respect to code running in parallel. Also, promises might be the more standard pattern. Thoughts?
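Here's a stripped-down sketch of the event-emitter version I rigged up (the event names and the setTimeout stand-in for real work are just placeholders):

const EventEmitter = require('events');

const master = new EventEmitter();
const moduleCount = 2;
let remaining = moduleCount;

// Each submodule listens for the master's start signal, does its
// phase-1 work, then reports back.
for (let i = 0; i < moduleCount; i++) {
  master.on('phase1', async () => {
    await new Promise(r => setTimeout(r, 100)); // stand-in for real work
    master.emit('phase1:done');
  });
}

// The master counts completions before starting the next phase.
master.on('phase1:done', () => {
  if (--remaining === 0) master.emit('phase2');
});

master.on('phase2', () => console.log('all modules finished phase 1'));
master.emit('phase1');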

Use events for things that happen more than once; for sequential data, use a more specialized kind of event emitter: streams.

Use promises for things that happen once.
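To make the distinction concrete, here is a minimal sketch using Node's built-in events module:

const EventEmitter = require('events');

const emitter = new EventEmitter();

// Events: the handler can fire any number of times.
emitter.on('data', chunk => console.log('got', chunk));
emitter.emit('data', 1);
emitter.emit('data', 2); // fires again

// Promises: the callback fires exactly once, no matter what.
const done = new Promise(resolve => {
  resolve('first');
  resolve('second'); // ignored - a promise settles only once
});
done.then(value => console.log(value)); // logs "first"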

Promises are, by design, a request-response pattern. You basically do:

doSomething().then(processResponse);

With async/await, promises have become a very powerful tool for various kinds of request-response: parallel, serial, batch, etc.:

// Parallel
const results = await Promise.all([a, b, c]);

// Serial
const results = [];
for (let i = 0; i < tasks.length; i++) {
    results.push(await tasks[i]());
}

// Batch: run 10 tasks at a time in parallel
const results = [];
for (let i = 0; i < tasks.length; i += 10) {
    const currentTasks = tasks.slice(i, i + 10);
    results.push(...(await Promise.all(currentTasks.map(t => t()))));
}

However, promises are not designed to intercept multiple events. Once a promise has resolved, its state is settled and it can never fire again.

This is where generic event emitters come in. For things like onclick listeners, requests arriving over the network (see Express.js), or keyboard input, promises cannot be used - unless, of course, you intend to stop listening to further events after processing the first one.
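Node even ships a small bridge between the two models for the single-occurrence case: events.once(), which returns a promise for the next emission only. A minimal sketch:

const { EventEmitter, once } = require('events');

const emitter = new EventEmitter();

// Recurring events need a persistent listener - a promise cannot do this.
emitter.on('tick', n => console.log('tick', n));

// A single occurrence, however, maps cleanly onto a promise.
const firstTick = once(emitter, 'tick'); // resolves on the next 'tick' only

emitter.emit('tick', 1); // seen by the listener and by the promise
emitter.emit('tick', 2); // seen only by the .on() listener

firstTick.then(([n]) => console.log('promise saw tick', n)); // logs 1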

For things that are inherently requests for some data use promises.

But note that both of these are just design patterns for managing asynchronous processes. They do not make functions asynchronous. Also note that asynchronous processes may or may not be multi-threaded. For network I/O they are single-threaded: JavaScript basically has parallel waiting, not parallel execution of instructions. There are, however, modules that let you start new threads or processes (web workers in browsers and child_process in Node.js).


Worker threads

If you look at web workers you will find that the API is event based. This is as it should be, because a worker cannot know when, or how many times, the master process will ask it to do a job. But you can easily wrap your own master-process API in a promise, because it is basically doing request-response (provided, of course, that each request triggers the worker to send back exactly one response).
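For example, a promise wrapper over a worker's event-based API could look like this. This is a sketch only: the worker.js file and the { id, payload } message shape are assumptions, and it relies on each job producing exactly one reply.

let nextId = 0;
const pending = new Map();

const worker = new Worker('worker.js'); // hypothetical worker script

// The worker's side stays event based: one onmessage handler for everything.
worker.onmessage = (e) => {
  const { id, result } = e.data;
  pending.get(id)(result); // resolve the promise for the matching request
  pending.delete(id);
};

// The master's side is request-response, so a promise fits.
function runJob(payload) {
  return new Promise(resolve => {
    const id = nextId++;
    pending.set(id, resolve);
    worker.postMessage({ id, payload });
  });
}

// const result = await runJob({ task: 'parse', data: '...' });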

So sometimes it's OK to do both - use the design pattern that makes sense.

If it's just one "start" event and only one "end" event per module, and there's a master orchestrating all of this, then promises will be (by far) the simpler option.

If the modules are supposed to register themselves with the master and/or emit multiple events for different parts, events give you greater flexibility (and a bit less coupling), but also a much more complex system that is harder to understand: you have to look at all the files to figure out the dependency graph.
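For the scenario in the question, the promise version of the whole orchestration stays very small. A sketch with stubbed-out modules:

// Hypothetical modules - each phase method returns a promise.
const module1 = { phase1: async () => {}, phase2: async () => {} };
const module2 = { phase1: async () => {}, phase2: async () => {} };
const modules = [module1, module2];

async function runPhases() {
  // Each phase starts only once every module has finished the previous one.
  await Promise.all(modules.map(m => m.phase1()));
  await Promise.all(modules.map(m => m.phase2()));
}

runPhases().then(() => console.log('all phases complete'));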
