Dat crashes when a file is quickly created and deleted, even when ignored
Bug Report
- Operating system: Debian
- Node Version: v12.10.0
- Dat Version: 13.13.1
I am trying to store a borg repository in dat. Borg creates a file called lock.exclusive while writing to the repository and deletes it when it is done. I have added lock.exclusive to .datignore in several pattern variants, just to be safe, but dat still crashes regardless:
My .datignore includes:
lock.exclusive/*
lock.*
lock.exclusive
/lock.exclusive/*
/lock.exclusive
Expected behavior
Dat should not watch ignored files, and therefore should not crash. Even if it does watch the file, it should handle the error gracefully and continue running.
Actual behavior
Dat crashes.
Debug Logs
nicolas@intranet:/mnt/extstorage/backups/privatesurvival-borg$ dat .
dat v13.13.1
dat://832065020821021379c1e8f1d4f0312885a2521edc0e23e69bdfb6f9a0924343
Sharing dat: 611 files (8.1 GB)
2 connections | Download 0 B/s Upload 213 B/s
Watching for file updates
DEL: /hints.1101
DEL: /index.1101
DEL: /integrity.1101
ADD: index.1105 (656 KB)
ADD: hints.1105 (558 B)
ADD: integrity.1105 (190 B)
DEL: /data/1/1102
Ctrl+C to Exit
internal/fs/watchers.js:173
throw error;
^
Error: ENOENT: no such file or directory, watch '/mnt/extstorage/backups/privatesurvival-borg/lock.exclusive'
at FSWatcher.start (internal/fs/watchers.js:165:26)
at Object.watch (fs.js:1340:11)
at /usr/lib/node_modules/dat/node_modules/recursive-watch/index.js:112:18
at FSReqCallback.oncomplete (fs.js:170:5) {
errno: -2,
syscall: 'watch',
code: 'ENOENT',
path: '/mnt/extstorage/backups/privatesurvival-borg/lock.exclusive',
filename: '/mnt/extstorage/backups/privatesurvival-borg/lock.exclusive'
}
Thanks for the bug report @nicolaschan! We will be moving towards a new version of the dat cli (2.0) within the next few months; information on how to upgrade is forthcoming. Hopefully the new version will fix your issue.