Log compression and transmission progress for large backups
**Is your feature request related to a problem? Please describe.**
I have a Docker Compose setup for Immich and I've added docker-volume-backup for each of the services that bind to my host filesystem. My photo data is around 80GiB. All the smaller services back up to my Azure storage container successfully, but the larger one does not. It is really hard to debug where the problem is, as it takes a long time before the error appears.
**Describe the solution you'd like**
I think it would be a good idea to be able to see the progress of the compression and transmission in the console. This could be every minute or so. The goal would be to confirm that progress is being made.
**Describe alternatives you've considered**
I haven't considered any alternatives. I'm trying to debug my setup by adjusting the lock timeout (which is probably the root issue).
**Additional context**
This is a pretty cool service and I love how easy it is to back up each volume as part of the Docker Compose setup.
I think this would be a great addition, but it's definitely not trivial to add (which is probably why it's not there yet). To get this working, someone would have to normalize progress reporting across all storage backends into something that always works. Some backends might not even be able to report progress at all.
I'd need more time than I currently have to add this, but if anyone else wants to work on it right away, I am happy to assist.
As for fixing your underlying issue: do you know whether the backup fails during compression or during upload? It seems people who want to back up rather large amounts of data often struggle with this tool.
I have a similar problem. I've succeeded in uploading about 4GB of data to Cloudflare R2. However, while the upload was running, my other services couldn't be reached (502, 530, and timeout errors). I'm still investigating the root cause, so I'd appreciate any advice.
ref: https://github.com/5ouma/homelab