Node.js Express application taking too much time to process outbound HTTP requests under load testing
I have an Express (Node.js v0.10.29) proxy on a Linux machine (Ubuntu 12.04 64-bit, 3.75 GB memory, 1 core) which makes several outbound HTTP requests for each incoming request. While load testing I found that the response time becomes very slow: around 30 seconds for a request that makes 4 outbound requests, when firing 1000 requests.

After some investigation I confirmed that the outbound requests are the bottleneck and ruled out machine limits (CPU and memory never go above 20%, and I increased the open-file limit to 10000).

I first used the request module for the outbound requests, then tried switching to the http module directly. For both of them I tried:

- increasing globalAgent.maxSockets
- using agent = false
- using my own agent with various maxSockets values
- setting request.setNoDelay(true)
- using cluster

but none of this made any difference to my load-testing results.
What can be the problem?
Here is my latest code for the HTTP request:
var http = require('http');
var url = require('url');
var zlib = require('zlib');

var agent = new http.Agent();
agent.maxSockets = 100;

var doPost = function(reqUrl, body, next, onError) {
    var stringBody = JSON.stringify(body);
    var headers = {
        'Content-Type': 'application/json',
        'Accept-Encoding': 'gzip',
        // byte length, not character count, in case the body contains multi-byte characters
        'Content-Length': Buffer.byteLength(stringBody)
    };
    var parsedUrl = url.parse(reqUrl);
    var options = {
        host: parsedUrl.hostname, // hostname only; the port goes in its own option
        port: parsedUrl.port,
        path: parsedUrl.path,
        method: 'POST',
        headers: headers,
        //agent: false
        agent: agent
    };
    doHttpRequest(options, stringBody, next, onError);
};
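One detail worth checking in headers like the above: Content-Length must be the byte length of the body, not the character count, or bodies with multi-byte characters get truncated. A quick illustration:

```javascript
// a JSON body containing a multi-byte character ('é' is 2 bytes in UTF-8)
var body = JSON.stringify({ name: 'héllo' });

console.log(body.length);              // 16 - character count
console.log(Buffer.byteLength(body)); // 17 - byte count, what Content-Length needs
```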
function doHttpRequest(options, body, next, onError) {
    var req = http.request(options, function(res) {
        var chunks = [];
        res.on('data', function(chunk) {
            chunks.push(chunk);
        });
        res.on('end', function() {
            var buffer = Buffer.concat(chunks);
            var encoding = res.headers['content-encoding'];
            if (encoding === 'gzip') {
                zlib.gunzip(buffer, function(error, decoded) {
                    if (error) return onError(error);
                    next(JSON.parse(decoded.toString()));
                });
            } else if (encoding === 'deflate') {
                zlib.inflate(buffer, function(error, decoded) {
                    if (error) return onError(error);
                    next(JSON.parse(decoded.toString()));
                });
            } else {
                next(null, buffer.toString());
            }
        });
    });
    req.on('error', function(e) {
        log(e);
    });
    req.setNoDelay(true);
    req.write(body);
    req.end();
}
The "next" callback will then call the "doPost" function a few more times (in this case 4 times).
I am seeing the same behaviour. I have a simple proxy module: when I call an endpoint directly it takes 50 ms, but via the proxy server (which internally makes another request and pipes it back to the original response) it takes double, ~100 ms. I have also tried all the options you mentioned here.
I am glad I am not alone in having this issue; I will dig into it further and let you know if I find something.
UPDATE
After setting keepAlive: true on the agent, the proxied request came in at only ~5 ms more than a direct call.