
How to efficiently watch a large number of files for changes?

I'm looking to write a daemon/service that can efficiently watch multiple files for changes and trigger an event when a file is written. The issue is that, for what I have in mind, I might be tracking potentially hundreds of files on a relatively weak computer, and I'm worried about performance.

I was thinking about using fswatch, but I'm not sure about its performance, given that it uses different types of monitors on different operating systems, especially with potentially hundreds of files.

I'm looking for a cross-platform solution, preferably in C++ or Node.js. Any help would be very much appreciated!

There are quite a few npm packages besides the built-in fs.watch. Performance depends on many factors, and since you said your system is a bit slow, Chokidar or hound are good options to go for.

const chokidar = require('chokidar');

// One-liner for current directory
chokidar.watch('.').on('all', (event, path) => {
  console.log(event, path);
});

Also, you can divide your directory tree into smaller chunks and assign each chunk to a child process, which reports back to the parent whenever something in its chunk changes (see the sketch below). However, this is just a suggestion: if a worker dies, you need to keep track of which files to reassign to a replacement, and you have to implement that yourself, which is a bit complicated.
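
Something along these lines could work. This is only a minimal sketch, assuming chokidar is installed and a hypothetical watcher-child.js sits next to the parent script; the directory chunks are placeholders you would replace with your own:

// parent.js (minimal sketch; file names and chunks are hypothetical)
const path = require('path');
const { fork } = require('child_process');

// Hypothetical chunks of the directory tree, one per worker.
const chunks = [
  ['./logs', './config'],
  ['./data'],
];

for (const paths of chunks) {
  const worker = fork(path.join(__dirname, 'watcher-child.js'));
  worker.send({ paths });                      // hand this worker its chunk
  worker.on('message', ({ event, file }) => {  // worker reports changes
    console.log(event, file);
  });
  worker.on('exit', () => {
    // A crashed worker leaves its chunk unwatched; as noted above, you
    // would need to re-fork a worker with the same paths here.
  });
}

// watcher-child.js (watches its assigned chunk and reports to the parent)
const chokidar = require('chokidar');

process.on('message', ({ paths }) => {
  chokidar.watch(paths).on('all', (event, file) => {
    process.send({ event, file });
  });
});

fork() sets up an IPC channel automatically, so parent and child can exchange messages with send() without any extra plumbing.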

I suggest implementing this with one of those npm packages, or with fs.watch itself.

You can check the other event types in the documentation for fs.watch.
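
For completeness, here is a minimal fs.watch sketch with no third-party dependencies. Note that fs.watch only reports two event types, 'rename' and 'change', and its behavior varies by platform:

const fs = require('fs');

// Watch the current directory; eventType is 'rename' or 'change'.
// On some platforms, filename may be null.
fs.watch('.', (eventType, filename) => {
  console.log(eventType, filename);
});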
