
Node.js / Is the for loop async?

I have the following code:

for (var i = 0; i < files.length; i++) {
    var data = fs.readFileSync(files[i]);
    fs.appendFileSync("file_part", data);
}

Let's say I have 1000 files. Is the for loop async, or will my application wait until it finishes the whole process?

If it's sync, how do I make it async, so that each iteration is async?

Edit:

This code is part of a server built with Express. We get a lot of HTTPS requests and serve the clients. One of the operations is to append 1MB files to one big file. The server should return 200 if the operation succeeds, and 500 otherwise.

The problem is that when the code above (the for loop) is executed, Express stops serving other clients and stays "busy" with this request. I tried to use async.eachSeries (because the order of the files matters, so I have to use a Series method), but it still blocks the server.

All control structures in JS are executed synchronously. Since you used no asynchronous functions in the loop body (only their explicitly synchronous counterparts), the loop will wait for each call to finish before moving on.


To make the snippet non-blocking, you need to use the regular asynchronous fs methods and chain them appropriately. I would recommend promises, but if you are already using the async library, then it should look like this:

async.eachSeries(files, function(file, cb) {
    fs.readFile(file, function(err, data) {
        if (err) return cb(err);
        fs.appendFile("file_part", data, cb);
    });
}, function(err) {
    // call back to express here with 500 if err and 200 otherwise
});
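
For comparison, a promise-based version is also possible. Below is a minimal sketch assuming Node 10+ with the fs.promises API; the appendAll name and the commented-out Express calls are illustrative, not part of the original code:

const fs = require("fs").promises;

// Append each file to "file_part" in order; resolves when done, rejects on the first error.
async function appendAll(files) {
    for (const file of files) {
        const data = await fs.readFile(file);    // read one file at a time
        await fs.appendFile("file_part", data);  // append it before moving to the next
    }
}

// In the Express handler:
// appendAll(files)
//     .then(() => res.sendStatus(200))
//     .catch(() => res.sendStatus(500));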

There are several problems in both the question and what you're trying to do. You may want to either clarify the question or clarify what you really want to do.

The short answer to your question: No. A for loop is not, itself, asynchronous.

A slightly longer answer: It isn't clear you need the loop itself to be async; you may just want its contents to be async. So instead of calling readFileSync you may want to use readFile, which takes a callback. In that callback, you can append to the file. This will "fire off" 1000 file reads, each of which will call its callback when ready.

But there is still a problem. If you read the 1000 files asynchronously and append them to the same file as they come in, you won't get the file contents in any particular order and may even have file contents interleaved. I can't imagine why you actually want to do this.
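
To illustrate the problem, a naive "fire off everything" version might look like the sketch below (shown only to demonstrate the issue, not as a recommendation); every read starts at once, so the appends complete in whatever order the reads happen to finish:

// Starts all reads immediately; appends land in an unpredictable order.
files.forEach(function(file) {
    fs.readFile(file, function(err, data) {
        if (err) return console.error(err);
        fs.appendFile("file_part", data, function(err) {
            if (err) console.error(err);
        });
    });
});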

Update: Your updated question is still a little vague. If you need to wait until everything is done before sending the 200 reply, then you're going to have to wait until all the files are appended. There really isn't any other solution.

If, however, you can return code 200 immediately to indicate you're working on the file appends, and then do the appending in the background, you can execute the loop as part of an asynchronous task. You can do this inside a process.nextTick() call or in a promise or in any other way that makes sense to your flow.
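
For example, here is a minimal sketch of that approach inside an Express route (the route path, app, and the files array are assumptions here; the appending itself reuses the async.eachSeries pattern shown in the other answer):

app.post("/concat", function(req, res) {
    // Acknowledge the request right away...
    res.sendStatus(200);

    // ...then do the slow appending in the background, off the current call stack.
    process.nextTick(function() {
        async.eachSeries(files, function(file, cb) {
            fs.readFile(file, function(err, data) {
                if (err) return cb(err);
                fs.appendFile("file_part", data, cb);
            });
        }, function(err) {
            // The client already got its 200, so just log any failure here.
            if (err) console.error("append failed:", err);
        });
    });
});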

If all you want to do is concatenate a list of files asynchronously, you can just do it in the callback to fs.readFile:

function concatFiles (source_array, destination) {
    // Take (and remove) the first file from the source list
    var source = source_array.shift();

    fs.readFile(source, function(err, data) {
        fs.appendFile(destination, data, function(err) {
            // Recurse until the source list is empty
            if (source_array.length) {
                concatFiles(source_array, destination);
            }
        });
    });
}

// Usage:
concatFiles(files,"file_part");

Note that for clarity I didn't handle errors in the code above. In production code you should check for errors and handle them appropriately.

It's fairly obvious what's going on: take the first file off the source list, read that file asynchronously, then, when that's done, append it to the destination file asynchronously. Repeat until the source list is empty.
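
If you do need to report errors (for example, to choose between the 200 and 500 responses mentioned in the question), one option is to add a completion callback. The done parameter below is my addition, not part of the code above:

function concatFiles (source_array, destination, done) {
    if (!source_array.length) return done(null);     // nothing left: report success

    var source = source_array.shift();

    fs.readFile(source, function(err, data) {
        if (err) return done(err);                   // stop on the first read error
        fs.appendFile(destination, data, function(err) {
            if (err) return done(err);               // stop on the first write error
            concatFiles(source_array, destination, done);
        });
    });
}

// Usage:
// concatFiles(files, "file_part", function(err) {
//     res.sendStatus(err ? 500 : 200);
// });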
