
node.js request.js not firing requests in parallel

(I also posted this as an issue on request's github page - link )

I have an app that uses request to grab API data from an internal API.

I was doing some tests, and it seemed that our app was scaling poorly under heavy load, and API responses were getting incrementally slower.

I narrowed this down to the request call we're using, and it looks like request is firing these in a queue, not all at once. Here's a test script that shows this behavior:

#!/usr/bin/env node
var request = require("request"),
    logger = require("./logger.js"),
    argv = require("optimist").argv;
var numRequests = argv.requests || argv.r,
    requestsMade = 0,
    wait = argv.wait || argv.w || 0,
    url = argv.url || argv.u || "http://www.google.com",
    // Issue a single GET and report how long the round trip took.
    requestApi = function(url, callback){
        var requestTime = new Date().getTime();
        request({
                method: "GET",
                uri: url
            }, function(err, response, body){
                var totalTime = (new Date().getTime()) - requestTime;
                callback(err, response, body, totalTime);
        });
    },
    // Fire one request per interval tick until numRequests have been started.
    doRequest = function(){
        requestsMade++;
        if(requestsMade == numRequests) clearInterval(requestMaker);
        var thisRequest = requestsMade;
        logger.info("Firing Request #" + thisRequest);
        requestApi(url, function(err, response, body, totalTime){
            if(err){
                logger.error("error contacting API ", err, " trying to request ", url, " after ", totalTime, "ms");
            } else {
                logger.info("Api responded to request #" + thisRequest + " after ", totalTime, "ms");
            }
        });
    };
logger.info("Starting Test with " + numRequests + " Requests.");
var requestMaker = setInterval(doRequest,wait);

(logger.js is just a logging tool to print timestamps & set up log levels).
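For completeness, a minimal stand-in for logger.js might look like the sketch below. The original file isn't shown, so the output format and method names here are assumptions, just enough to make the test script runnable as-is:

// logger.js -- assumed minimal stand-in: timestamped info/error logging to the console.
function log(level, args) {
    var parts = Array.prototype.slice.call(args).join("");
    console.log("[" + new Date().toString() + "] " + level + ": " + parts);
}

module.exports = {
    info: function () { log("INF", arguments); },
    error: function () { log("ERR", arguments); }
};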

And a simple test against Google shows the incremental slowdown:

$ node requestTester.js -r 20
[Wed 2011-11-2 11:2:24.575 GMT7] INF: Starting Test with 20 Requests.
[Wed 2011-11-2 11:2:24.580 GMT7] INF: Firing Request #1
[Wed 2011-11-2 11:2:24.654 GMT7] INF: Firing Request #2
[Wed 2011-11-2 11:2:24.661 GMT7] INF: Firing Request #3
[Wed 2011-11-2 11:2:24.664 GMT7] INF: Firing Request #4
[Wed 2011-11-2 11:2:24.667 GMT7] INF: Firing Request #5
[Wed 2011-11-2 11:2:24.672 GMT7] INF: Firing Request #6
[Wed 2011-11-2 11:2:24.673 GMT7] INF: Firing Request #7
[Wed 2011-11-2 11:2:24.674 GMT7] INF: Firing Request #8
[Wed 2011-11-2 11:2:24.675 GMT7] INF: Firing Request #9
[Wed 2011-11-2 11:2:24.675 GMT7] INF: Firing Request #10
[Wed 2011-11-2 11:2:24.676 GMT7] INF: Firing Request #11
[Wed 2011-11-2 11:2:24.677 GMT7] INF: Firing Request #12
[Wed 2011-11-2 11:2:24.678 GMT7] INF: Firing Request #13
[Wed 2011-11-2 11:2:24.679 GMT7] INF: Firing Request #14
[Wed 2011-11-2 11:2:24.680 GMT7] INF: Firing Request #15
[Wed 2011-11-2 11:2:24.681 GMT7] INF: Firing Request #16
[Wed 2011-11-2 11:2:24.682 GMT7] INF: Firing Request #17
[Wed 2011-11-2 11:2:24.683 GMT7] INF: Firing Request #18
[Wed 2011-11-2 11:2:24.684 GMT7] INF: Firing Request #19
[Wed 2011-11-2 11:2:24.685 GMT7] INF: Firing Request #20
[Wed 2011-11-2 11:2:25.257 GMT7] INF: Api responded to request #2 after  602 ms
[Wed 2011-11-2 11:2:25.621 GMT7] INF: Api responded to request #1 after  1041 ms
[Wed 2011-11-2 11:2:25.774 GMT7] INF: Api responded to request #3 after  1113 ms
[Wed 2011-11-2 11:2:25.779 GMT7] INF: Api responded to request #4 after  1115 ms
[Wed 2011-11-2 11:2:25.895 GMT7] INF: Api responded to request #5 after  1228 ms
[Wed 2011-11-2 11:2:26.378 GMT7] INF: Api responded to request #9 after  1703 ms
[Wed 2011-11-2 11:2:26.714 GMT7] INF: Api responded to request #7 after  2041 ms
[Wed 2011-11-2 11:2:26.870 GMT7] INF: Api responded to request #8 after  2196 ms
[Wed 2011-11-2 11:2:27.126 GMT7] INF: Api responded to request #10 after  2449 ms
[Wed 2011-11-2 11:2:27.267 GMT7] INF: Api responded to request #6 after  2595 ms
[Wed 2011-11-2 11:2:27.730 GMT7] INF: Api responded to request #14 after  3051 ms
[Wed 2011-11-2 11:2:28.68 GMT7] INF: Api responded to request #13 after  3389 ms
[Wed 2011-11-2 11:2:28.72 GMT7] INF: Api responded to request #11 after  3395 ms
[Wed 2011-11-2 11:2:28.75 GMT7] INF: Api responded to request #12 after  3398 ms
[Wed 2011-11-2 11:2:28.332 GMT7] INF: Api responded to request #16 after  3650 ms
[Wed 2011-11-2 11:2:28.471 GMT7] INF: Api responded to request #15 after  3791 ms
[Wed 2011-11-2 11:2:29.45 GMT7] INF: Api responded to request #18 after  4362 ms
[Wed 2011-11-2 11:2:29.161 GMT7] INF: Api responded to request #17 after  4479 ms
[Wed 2011-11-2 11:2:29.173 GMT7] INF: Api responded to request #19 after  4488 ms
[Wed 2011-11-2 11:2:29.424 GMT7] INF: Api responded to request #20 after  4738 ms

Why does this happen? Is there a config option I missed that will make the requests all fire at once?

I'd hate to have to ditch request as I love it so much, but this is a showstopper for this project.

If you don't want to use node v0.5.10 (which is already nearly stable and is the candidate for the upcoming node v0.6.x), you'll have to patch request. In request's main.js, replace the line

var globalPool = {}

with

var globalPool = false

This should remove the limit on concurrent connections.
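If you'd rather not edit the library source, you can usually get the same effect by raising the socket cap instead of removing the pool. The root cause is the per-host maxSockets limit (5 by default) on the HTTP agent that request draws connections from. A minimal sketch, assuming a node version where http.globalAgent exists (0.5.x/0.6.x) and a request version that accepts a pool option; check what your installed versions actually support:

var http = require("http");
var request = require("request");

// Assumption: on node >= 0.5.x the default agent is http.globalAgent;
// its maxSockets (5 per host by default) is what serializes the requests.
http.globalAgent.maxSockets = 20;

// Assumption: newer request versions also accept a per-call pool,
// which raises the cap without touching the global agent.
request({
    method: "GET",
    uri: "http://www.google.com",
    pool: { maxSockets: 20 }
}, function (err, response, body) {
    console.log(err || response.statusCode);
});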
