
Blocking requests not running simultaneously on PM2

In my Express app, I have defined 2 endpoints: one for an is-server-up check and one for simulating a blocking operation.

app.use('/status', (req, res) => {
    res.sendStatus(200);
});

app.use('/p', (req, res) => {
    const { logger } = req;
    logger.info({ message: 'Start' });
    let i = 0;
    const max = 10 ** 10;
    while (i < max) {
        i += 1;
    }
    res.send(`${i}`);
    logger.info({ message: 'End' });
});

I am using winston for logging and PM2 for clustering, started with the following command:

$ pm2 start bin/httpServer.js -i 0

It launched 4 instances.
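(With -i 0, PM2 spawns one worker per CPU core; the running workers can be listed with the command below. The output will vary by machine.)

$ pm2 list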

Now, when I visit the routes /p, /p, /status in that order in different tabs, with about a 1-second delay between request 1 and request 2 and between request 2 and request 3, I expected the responses for request 1 and request 2 to arrive after some time, offset by roughly 1 second, and the response for request 3 to come instantly.

Actual: the response for request 3 did come instantly, but something weird happened with request 1 and request 2: request 2 didn't even start until request 1 was completed. Here are the logs I got; you can see the timestamp for the end of request 1 and the start of request 2.

{"message":"Start","requestId":"5c1f85bd-94d9-4333-8a87-30f3b3885d9c","level":"info","timestamp":"2020-12-28 07:34:48"}
{"message":"End","requestId":"5c1f85bd-94d9-4333-8a87-30f3b3885d9c","level":"info","timestamp":"2020-12-28 07:35:03"}
{"message":"Start","requestId":"f1f86f68-1ddf-47b1-ae62-f75c7aa7a58d","level":"info","timestamp":"2020-12-28 07:35:03"}
{"message":"End","requestId":"f1f86f68-1ddf-47b1-ae62-f75c7aa7a58d","level":"info","timestamp":"2020-12-28 07:35:17"}

Why did request 1 and request 2 not start at the same time (offset by 1 second, of course)? And if they run one after the other, why did request 3 respond instantly rather than waiting for requests 1 and 2 to complete?

That's because the Connection header in the response your Node server sends is keep-alive by default. So the connection will be reused when you use a browser (curl can also simulate the connection-reuse situation). That means multiple requests are served by the same instance within a specified time, even though you have multiple Node instances.

Note: you can see the specified time in a response header like Keep-Alive: timeout=5. If you use a browser, open the Network tab to see the response headers; if you use curl, add the -v option to see them.
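For example, you should see something like this (a sketch, assuming the server from the test code below is listening on port 8080; the exact headers depend on your Node version):

$ curl -v http://localhost:8080/status
...
< HTTP/1.1 200 OK
< Connection: keep-alive
< Keep-Alive: timeout=5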

You could try running multiple separate curl commands at the same time in a terminal; separate curl commands mean the connection will not be reused, so you'll get your expected result (see the example commands after the logs below). You can add a console.log("status test") in the /status route, then use pm2 logs to see which instance served the request, in a format like the following (these logs were produced by accessing the endpoint with a browser).

0|server  | status test
0|server  | status test

0 means the first instance; you will see that it is always the same instance serving the requests when you use a browser to access the endpoint. But if you use curl, you'll find that the number keeps changing, which means every request is served by a different Node instance.
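Here is one way to send separate, simultaneous requests from a terminal (a sketch, assuming the server is on port 8080; each backgrounded curl opens its own connection, so PM2 can hand them to different instances):

$ curl http://localhost:8080/p &
$ curl http://localhost:8080/p &
$ curl http://localhost:8080/status

With this, /status responds immediately and the two /p requests finish at roughly the same time, because each is handled by a different worker.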

You can see that I sent two requests at the same time with curl in the terminal; different Node instances served them, so the start and end times printed by console.log are the same. In this example I have 8 instances (8 event loops), so I can handle 8 long-running (synchronous-code) requests at the same time.

(screenshot: two simultaneous curl requests to /p served by different instances, with matching start and end times)

And you can use curl to simulate the keep-alive situation; then you'll see the requests are served by the same Node instance.

curl http://localhost:8080/status http://localhost:8080/status -v -H "Connection: keep-alive"

You can also use Connection: close to see the requests served by different Node instances.

curl http://localhost:8080/status http://localhost:8080/status -v -H "Connection: close"

You can see the difference here:

(screenshot: response headers for the keep-alive and close cases)

If you want to close the connection on the server side, you can use the following code:

res.setHeader("Connection", "close")
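For example, applied to the /status route from the test code below (a minimal sketch; closing the connection stops the browser from pinning follow-up requests to the same PM2 instance):

const express = require("express");
const app = express();

app.use('/status', (req, res) => {
    // Tell the client not to reuse this connection,
    // so the next request can land on a different instance.
    res.setHeader("Connection", "close");
    res.sendStatus(200);
});

app.listen(8080);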

This is my test code.

const express = require("express")
const app = express();
const port = 8080;

app.use('/status', (req, res) => {
    console.log("status tests");
    res.sendStatus(200);
});

app.use('/p', (req, res) => {
    console.log(new Date() + " start");
    let i = 0;
    const max = 10 ** 10;
    while (i < max) {
        i += 1;
    }
    res.send(`${i}`);
    console.log(new Date() + " end");
});

app.listen(port, () => {
    return console.log(`server is listening on ${port}`);
});
