conserve
Repeated reads of some files cause suboptimal performance on slow or remote filesystems.
To do: trace file IO, investigate any case where an archive file is avoidably read more than once, and fix it.
Perhaps: emit a warning if the same file is read more than once.
#175, just fixed by @WolverinDEV, addresses one pathological case of repeatedly re-reading files.
However, there are other cases where a small file is read repeatedly in a way that is cheap on a local filesystem (where it will be in the page cache) but might be very slow remotely. This is definitely worth fixing; I think I have fixed some of these cases on the sftp branch, but there are probably more.
Originally posted by @sourcefrog in https://github.com/sourcefrog/conserve/issues/177#issuecomment-1214192996
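The "emit a warning if the same file is read more than once" idea could be prototyped with a small read tracker wrapped around the transport layer. This is a hypothetical sketch, not conserve's actual API: the `ReadTracker` type and `record` method are made up for illustration, and a real fix would hook into wherever archive files are actually opened.

```rust
use std::collections::HashMap;
use std::path::{Path, PathBuf};

/// Hypothetical helper: counts how many times each archive file is read,
/// so avoidable repeat reads can be surfaced as warnings during tracing.
#[derive(Default)]
struct ReadTracker {
    counts: HashMap<PathBuf, usize>,
}

impl ReadTracker {
    /// Record one read of `path`. Returns true (and emits a warning) if
    /// this path has already been read before in this session.
    fn record(&mut self, path: &Path) -> bool {
        let n = self.counts.entry(path.to_owned()).or_insert(0);
        *n += 1;
        if *n > 1 {
            eprintln!(
                "warning: {} read {} times; repeated reads may be slow on remote filesystems",
                path.display(),
                n
            );
            true
        } else {
            false
        }
    }
}

fn main() {
    let mut tracker = ReadTracker::default();
    // First read of a file is fine; a second read of the same path warns.
    assert!(!tracker.record(Path::new("archive/b0000/BANDHEAD")));
    assert!(tracker.record(Path::new("archive/b0000/BANDHEAD")));
    assert!(!tracker.record(Path::new("archive/b0001/BANDHEAD")));
}
```

On a local filesystem a second read is nearly free because the file is in cache, so this kind of counter is mainly useful as a diagnostic when tracing IO against slow or remote storage.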