
node.js - mutual variable

I'm new to node.js, so before releasing my node.js app, I need to be sure it will work as it should.

Let's say I have an array variable and I initialize it at the beginning of my script:

myArray = [];

Then I pull some data from an external API, store it inside myArray, and use the setInterval() method to pull this data again every 30 minutes:

pullData();
setInterval(pullData, 30*60*1000);

The pullData() function takes about 2-3 seconds to finish.

Clients will be able to get myArray using this function:

http.createServer(function(request, response){
    var path = url.parse(request.url).pathname;
    if(path == "/getdata"){
        var string = JSON.stringify(myArray);
        response.writeHead(200, {'Content-Type': 'text/plain'});
        response.end(string);
    }
}).listen(8001);

So what I'm asking is: can the following situation happen? A client tries to get data from this node.js server at the same moment that data is being written into myArray by the pullData() function, resulting in invalid data being sent to the client.

I read some documentation, and what I understood is that while pullData() is running, the server created by createServer() will not respond to clients until pullData() finishes its job. Is that correct? I'm really not good at concurrent programming, so I would appreciate confirmation on this, or a better solution if you have one.

EDIT: here is the code of my pullData() function:

var now = new Date();

Date.prototype.addDays = function(days){
    var dat = new Date(this.valueOf());
    dat.setDate(dat.getDate() + days);
    return dat;
};

var endDateTime = now.addDays(noOfDays);
var formattedEnd = endDateTime.toISOString();

var url = "https://api.mindbodyonline.com/0_5/ClassService.asmx?wsdl";
soap.createClient(url, function (err, client) {
    if (err) {
        throw err;
    }

    client.setEndpoint('https://api.mindbodyonline.com/0_5/ClassService.asmx');
    var params = {
        "Request": {
            "SourceCredentials": {
                "SourceName": sourceName,
                "Password": password,
                "SiteIDs": {
                    "int": [siteIDs]
                }
            },
            "EndDateTime": formattedEnd
        }
    };

    client.Class_x0020_Service.Class_x0020_ServiceSoap.GetClasses(params, function (errs, result) {
        if (errs) {
            console.log(errs);
        } else {
            var classes = result.GetClassesResult.Classes.Class;
            myArray = [];

            for (var i = 0; i < classes.length; i++) {
                var name = classes[i].ClassDescription.Name;
                var staff = classes[i].Staff.Name;
                var locationName = classes[i].Location.Name;
                var start = classes[i].StartDateTime.toISOString();
                var end = classes[i].EndDateTime.toISOString();
                var klasa = new Klasa(name, staff, locationName, start, end);

                myArray.push(klasa);
            }

            myArray.sort(function(a, b){
                var c = new Date(a.start);
                var d = new Date(b.start);
                return c - d;
            });

            string = JSON.stringify(myArray);
        }
    });
});

No. Node.js is not multi-threaded; everything runs on a single thread. This means that, except for non-blocking calls (i.e. I/O), everything else will occupy the CPU until it returns, and Node.js will not return a half-populated array to the end user, as long as you only make one call to populate your array.

Update: As pointed out by @RyanWilcox, any asynchronous (non-blocking syscall) call may cause the Node.js interpreter to leave your function execution halfway and return to it later.
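To make the difference concrete, here is a minimal sketch (the function names are illustrative, not from the question) contrasting a purely synchronous rebuild of the array with one that interleaves asynchronous calls:

// Safe: the whole rebuild happens synchronously inside one callback,
// so no HTTP request handler can run between these statements.
function applyResult(classes) {
    var fresh = [];
    for (var i = 0; i < classes.length; i++) {
        fresh.push(classes[i]);
    }
    myArray = fresh; // one synchronous assignment, never observed half-filled
}

// Risky: each asynchronous call yields back to the event loop, so a
// request handler could run while myArray is only partly populated.
function applyResultAsync(classes, i) {
    if (i >= classes.length) return;
    someAsyncLookup(classes[i], function (extra) { // hypothetical async call
        myArray.push(extra);
        applyResultAsync(classes, i + 1);
    });
}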

In general: No.

JavaScript is single threaded. While one function is running, no other function can be.

The exception is when there are asynchronous delays between the pieces of code that access the array.

eg

var i = 0;
function getNext() {
    // async.get stands in for any asynchronous operation on the array element
    async.get(myArray[i], function () {
        i++;
        if (i < myArray.length) {
            getNext();
        }
    });
}

… in which case the array could be updated between the calls to the asynchronous function.

You can mitigate that by creating a deep copy of the array when you start the first async operation.
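For instance, a minimal sketch of that mitigation (assuming the array holds plain data objects, so a JSON round-trip is an acceptable way to deep-copy, and async.get again stands in for the asynchronous operation):

// Take a deep copy before starting the asynchronous work, so later
// updates to myArray cannot affect the snapshot being processed.
var snapshot = JSON.parse(JSON.stringify(myArray));

var i = 0;
function getNext() {
    async.get(snapshot[i], function () {
        i++;
        if (i < snapshot.length) {
            getNext();
        }
    });
}
getNext();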

JavaScript is a single-threaded language, so you don't have to worry about this kind of concurrency: no two parts of your code are ever executed at the same time. Unlike many other programming languages, JavaScript has a different concurrency model, based on an event loop. To achieve the best performance, you should use non-blocking operations handled by callback functions, promises, or events. I suppose that your external API provides some asynchronous I/O functions, which is well suited to node.js.
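As a sketch of that non-blocking style (fetchClasses is a hypothetical callback-based API call, not part of the question's code), a callback-style operation can be wrapped in a promise so the result is applied in one step:

// Wrap a callback-style asynchronous call in a promise.
function fetchClassesAsync(params) {
    return new Promise(function (resolve, reject) {
        fetchClasses(params, function (err, result) { // hypothetical callback API
            if (err) { reject(err); } else { resolve(result); }
        });
    });
}

fetchClassesAsync({ endDateTime: formattedEnd })
    .then(function (result) {
        myArray = result; // single synchronous assignment
    })
    .catch(function (err) {
        console.log(err);
    });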

If your pullData call doesn't take too long, another solution is to cache the data.

Fetch the data only when needed (i.e. when a client accesses /getdata). Once fetched, cache the data together with a timestamp. When /getdata is called again, check whether the cached data is older than 30 minutes; if so, fetch it again.

Also, serializing the array to JSON:

var string = JSON.stringify(myArray);

...could be done outside the /getdata handler, so it does not have to be repeated for every client visiting /getdata. That might make it slightly quicker. A sketch combining this with the cache idea is shown below.
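A minimal sketch of that caching approach (fetchData is a hypothetical helper that performs the actual API call and returns the array through a callback):

var cachedString = null;       // pre-serialized JSON
var cachedAt = 0;              // timestamp of the last successful fetch
var MAX_AGE = 30 * 60 * 1000;  // 30 minutes

http.createServer(function (request, response) {
    var path = url.parse(request.url).pathname;
    if (path == "/getdata") {
        if (cachedString && Date.now() - cachedAt < MAX_AGE) {
            response.writeHead(200, {'Content-Type': 'text/plain'});
            response.end(cachedString);
            return;
        }
        fetchData(function (err, data) {         // hypothetical fetch helper
            if (err) {
                response.writeHead(500, {'Content-Type': 'text/plain'});
                response.end("error");
                return;
            }
            cachedString = JSON.stringify(data); // stringify once, reuse for later clients
            cachedAt = Date.now();
            response.writeHead(200, {'Content-Type': 'text/plain'});
            response.end(cachedString);
        });
    }
}).listen(8001);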
