How about many files?
Not a bug, just a question about how hound handles this. If I watch a folder that a lot of files get copied into, will all the processes run at once? Say I copy 200 files in there that need to be processed (each file goes through deeper processing). I'm concerned this will saturate CPU or IO. Is it possible to process only a defined number of files at once?
Thanks in advance!
I know this is an old one, but I'm curious about high loads as well. I'm doing some testing and I'm already seeing issues with files going in and out.
I wasn't using this project back then, and the last commit seems to be from around 2018. But hound appears to be just a wrapper around fs.watch: https://nodejs.org/docs/latest/api/fs.html#fs_fs_watch_filename_options_listener
On Linux, for example, fs.watch uses inotify, and in my experience inotify behaves well in terms of IO/CPU; it just consumes memory (up to ~1 KB per watched file, which isn't much nowadays). This may differ on macOS or Windows (see the link above for what's used there).
It may also depend on what your process does after detecting a change. And to answer my own question: looking at the code, it doesn't seem possible to process only a defined number of files with hound. So in short: hound only manages watching and unwatching files; reducing load while processing changed files must be done on your own. For example, you could collect the changed file paths in an array and process them in a separate step that throttles itself with timeouts or runs only at specified times.
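A minimal sketch of that idea: a small queue that caps how many files are processed concurrently, so 200 change events don't spawn 200 parallel jobs. The queue logic is generic; `MAX_CONCURRENT`, `processFile`, and the hound event names in the commented wiring are my assumptions, not part of hound itself.

```javascript
// Throttle "deeper processing" to a fixed number of files at a time.
const MAX_CONCURRENT = 4;   // tune to your CPU/IO budget (assumption)
const queue = [];           // changed files waiting to be processed
let active = 0;             // files currently being processed

// Placeholder for the expensive per-file work; replace with your own.
function processFile(file) {
  return new Promise(resolve => setTimeout(() => resolve(file), 10));
}

// Called for every change event: just record the file and try to drain.
function enqueue(file) {
  queue.push(file);
  drain();
}

// Start jobs until the concurrency cap is hit; re-drain as each finishes.
function drain() {
  while (active < MAX_CONCURRENT && queue.length > 0) {
    const file = queue.shift();
    active++;
    processFile(file).finally(() => {
      active--;
      drain();
    });
  }
}

// Wiring to hound would look roughly like this (event names assumed
// from its README; verify against the version you use):
// const hound = require('hound');
// const watcher = hound.watch('/watched/folder');
// watcher.on('create', enqueue);
// watcher.on('change', enqueue);
```

This way hound still sees every event immediately (so nothing is missed), but the heavy work is spread out at a rate you control.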