
How does a Node.js server serve the next request if the current request involves heavy computation?

Suppose I am using a Node server, and there is an API that generates a series from 1 to 1 million (i.e. a very heavy CPU operation). In this case, other requests arriving at the server are queued (and wait a long time for their turn, which kills the user experience) because Node is single-threaded.

Is there any way, with Node.js, to keep the other requests from waiting so long for their turn?

How does a Node.js server serve the next request if the current request involves heavy computation?

It doesn't - if that computation happens on the main thread and is not divided into smaller parts.

To have a chance of serving other requests during a CPU-intensive task, you need to either:

  1. break up your computation into parts and schedule them with setImmediate or process.nextTick
  2. use an external process for that task and call it like any other external program or service (over HTTP, TCP, or IPC, by spawning a child process, or via a queue system, pub/sub, etc.)
  3. write a native add-on in C++ and run the work on separate threads there
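
Option 1 can be sketched like this (the function name and the chunk size of 10,000 are illustrative choices, not a fixed API):

```javascript
// Sum 1..n in chunks, yielding to the event loop between chunks
// with setImmediate so pending I/O and timers can run in between.
function sumInChunks(n, chunkSize, done) {
  let total = 0;
  let i = 1;
  function runChunk() {
    const end = Math.min(i + chunkSize - 1, n);
    for (; i <= end; i++) total += i;
    if (i > n) return done(total);
    setImmediate(runChunk); // unwind the stack; let other events run
  }
  runChunk();
}

sumInChunks(1_000_000, 10_000, (total) => {
  console.log(total); // 500000500000
});
```

Prefer setImmediate over process.nextTick here: nextTick callbacks run before the event loop continues, so a long nextTick chain can still starve I/O.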

What's important is that the stack needs to unwind often in your V8 thread so that the event loop gets a chance to handle events as often as possible. And keep in mind that if a long computation takes 10 seconds and you divide it into 1000 smaller parts, your server will still be blocked from serving new requests (or any other I/O or event) 1000 times, for about 10 ms each time.

If you have a lot of CPU-heavy operations, I would strongly recommend moving them out of the process that serves the requests. This is not only because they block the event loop, but also because in such a case you want to utilize all of your cores at the same time. It is optimal to have as many processes (or threads) doing the CPU-heavy work as there are cores in your CPU (or possibly more with hyper-threading), and to keep all of your I/O-bound operations in a separate process that doesn't do CPU-heavy work itself.

Single-threaded doesn't mean that requests are scheduled strictly first-come-first-served. I seriously doubt that multiple requests are processed in first-come-first-served order, so this won't be much of a problem. The overall system will slow down due to requests that take too long to process, though.

And for that, node has a solution:

https://nodejs.org/api/cluster.html

With this, you can spawn multiple instances of your app, all listening on the same port. So if only a small fraction of your requests take too long, the other child processes in the cluster can still respond to subsequent requests.
