
Blocking requests not running simultaneously on PM2

In my Express app, I have defined 2 endpoints: one for an is-server-up check and one for simulating a blocking operation.

app.use('/status', (req, res) => {
    res.sendStatus(200);
});

app.use('/p', (req, res) => {
    const { logger } = req;
    logger.info({ message: 'Start' });
    let i = 0;
    const max = 10 ** 10;
    while (i < max) {
        i += 1;
    }
    res.send(`${i}`);
    logger.info({ message: 'End' });
});

I am using winston for logging and PM2 for clustering, started with the following command:

$ pm2 start bin/httpServer.js -i 0

It has launched 4 instances.
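The middleware that attaches req.logger is not shown above; a minimal sketch of what it could look like, assuming a winston child logger and a per-request UUID (an assumption, not the original code):

const { createLogger, format, transports } = require('winston');
const { v4: uuidv4 } = require('uuid');

const baseLogger = createLogger({
    format: format.combine(format.timestamp({ format: 'YYYY-MM-DD HH:mm:ss' }), format.json()),
    transports: [new transports.Console()],
});

// Attach a child logger carrying a per-request id (matches the requestId field in the logs below).
app.use((req, res, next) => {
    req.logger = baseLogger.child({ requestId: uuidv4() });
    next();
});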

Now, when I visit the routes /p, /p, /status in that order in different tabs, with around 1 second between request 1 and request 2 and between request 2 and request 3, I expected the responses for request 1 and request 2 to arrive after some time (about 1 second apart), and the response for request 3 to come instantly.

Actual: The response for request 3 did come instantly, but something weird happened with request 1 and request 2: request 2 didn't even start until request 1 was completed. Here are the logs I got. You can see the timestamps for the end of request 1 and the start of request 2.

{"message":"Start","requestId":"5c1f85bd-94d9-4333-8a87-30f3b3885d9c","level":"info","timestamp":"2020-12-28 07:34:48"}
{"message":"End","requestId":"5c1f85bd-94d9-4333-8a87-30f3b3885d9c","level":"info","timestamp":"2020-12-28 07:35:03"}
{"message":"Start","requestId":"f1f86f68-1ddf-47b1-ae62-f75c7aa7a58d","level":"info","timestamp":"2020-12-28 07:35:03"}
{"message":"End","requestId":"f1f86f68-1ddf-47b1-ae62-f75c7aa7a58d","level":"info","timestamp":"2020-12-28 07:35:17"}

Why did request 1 and request 2 not start at the same time (with the 1 second delay, of course)? And if they are running synchronously, why did request 3 respond instantly instead of waiting for requests 1 and 2 to complete?

That's because the Connection header in the response is keep-alive, which your Node server sends by default. The connection is therefore reused when you use a browser (curl can also simulate the reused-connection situation). That means multiple requests are served by the same instance within a specified time, even though you have multiple Node instances.

Note: You can see the specified time in the response header, e.g. Keep-Alive: timeout=5. If you use a browser, open the network tab to see the response headers; if you use curl, add the -v option to see them.
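For example, a single verbose curl against the test server below (port 8080 assumed) prints the response headers:

curl -v http://localhost:8080/status
# look for "Connection: keep-alive" and "Keep-Alive: timeout=5" among the response headers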

You could try running multiple separate curl commands at the same time in a terminal (an example is shown below). Separate curl commands mean the connection will not be reused, so you'll get the results you expected. You could also add a console.log("status test") in the /status route, then use pm2 logs to see which instance serves the request, in the following format (these logs were produced by accessing the endpoint with a browser).

0|server  | status test
0|server  | status test

0 means the first instance. When you use a browser to access the endpoint, you'll see it is always the same instance serving the requests. But if you use curl, you'll find the number keeps changing, which means every request is served by a different Node instance.
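As a rough sketch (again assuming port 8080), two requests fired in parallel from a shell each open their own connection, so the cluster can hand them to different instances:

# Each curl runs in its own process and opens its own connection.
curl http://localhost:8080/p &
curl http://localhost:8080/p &
wait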

You can see I sent two requests at the same time with curl in the terminal, and a different Node instance served each request, so the start and end times in the console.log output are the same. In this example I have 8 event loops, so I can handle 8 long-running (synchronous) requests at the same time.

(screenshot of the terminal output)

You could also use curl to simulate the keep-alive situation; then you'll see the requests are served by the same Node instance.

curl http://localhost:8080/status http://localhost:8080/status -v -H "Connection: keep-alive"

And you can use Connection: close to see the requests served by different Node instances.

curl http://localhost:8080/status http://localhost:8080/status -v -H "Connection: close"

You can see the difference here.

(screenshot comparing the keep-alive and Connection: close curl outputs)

If you want to close the connection on the server side, you could use the following code.

res.setHeader("Connection", "close")
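For instance, applied to the /status route from the test code below, it would look like this (just a sketch):

app.use('/status', (req, res) => {
    // Tell the client not to reuse this connection, so the next request
    // can be routed to a different cluster instance.
    res.setHeader("Connection", "close");
    res.sendStatus(200);
});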

This is my test code.

const express = require("express")
const app = express();
const port = 8080;

app.use('/status', (req, res) => {
    console.log("status tests");
    res.sendStatus(200);
});

app.use('/p', (req, res) => {
    console.log(new Date() + " start");
    let i = 0;
    const max = 10 ** 10;
    while (i < max) {
        i += 1;
    }
    res.send(`${i}`);
    console.log(new Date() + " end");
});

app.listen(port, () => {
    return console.log(`server is listening on ${port}`);
});
