Store analysis in an embedded K-V store
Add the possibility to store the analysis in a fast embedded key-value store, e.g. BadgerDB, instead of in memory. The analysis would of course be much slower, but it would use very little memory, and the stored result could be reopened again very quickly.
I can run tests with large datasets.
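For illustration, one way such a store could be laid out: keys are directory paths, values the serialized list of entries for that directory. A minimal Go sketch of this idea, using an in-memory map as a stand-in for BadgerDB and a hypothetical `Entry` type (gdu's real record format differs; with Badger the same layout would go through `txn.Set`/`txn.Get`):

```go
package main

import (
	"bytes"
	"encoding/gob"
	"fmt"
)

// Entry is a hypothetical, minimal stand-in for gdu's real file record.
type Entry struct {
	Name  string
	Size  int64
	IsDir bool
}

// store maps a directory path to the gob-encoded list of its entries.
// An in-memory map stands in for an embedded K-V store here.
type store map[string][]byte

// putDir encodes the entries of one directory and writes them under its path.
func (s store) putDir(path string, entries []Entry) error {
	var buf bytes.Buffer
	if err := gob.NewEncoder(&buf).Encode(entries); err != nil {
		return err
	}
	s[path] = buf.Bytes()
	return nil
}

// getDir loads and decodes the entries of one directory by its path,
// so a viewer can fetch just the directory being browsed.
func (s store) getDir(path string) ([]Entry, error) {
	raw, ok := s[path]
	if !ok {
		return nil, fmt.Errorf("no entry for %s", path)
	}
	var entries []Entry
	err := gob.NewDecoder(bytes.NewReader(raw)).Decode(&entries)
	return entries, err
}

func main() {
	db := store{}
	if err := db.putDir("/large_dir", []Entry{
		{Name: "subdir1", IsDir: true},
		{Name: "file.txt", Size: 42},
	}); err != nil {
		panic(err)
	}
	entries, err := db.getDir("/large_dir")
	if err != nil {
		panic(err)
	}
	for _, e := range entries {
		fmt.Println(e.Name, e.Size)
	}
}
```

The point of the per-directory keying is that the viewer never has to load the whole tree: opening a saved result only decodes the directories actually visited.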
This is an awesome tool, thanks so much!
This DB feature would be very useful for periodic scans of very large servers. duc is great for this, but it's much slower to scan than gdu on my server (as expected, based on your excellent benchmark data!). Once the scan is complete, though, duc is much snappier to open and explore the result (stored in a Tokyo Cabinet database) than gdu (stored as JSON), again as expected.
This may also enable #150, allowing users to view partially scanned directory trees if the scan aborts or is stopped. Resuming a scan from an aborted/stopped state would also be amazing, but might be tricky to do.
I would also find it useful to be able to merge/join two databases. Sometimes I find that the best way to reduce resource usage and scan time is to take /large_dir/ and, instead of scanning it all in one go, scan /large_dir/subdir1/ and then /large_dir/subdir2/ separately. But then those two subdirs have to be explored separately too. If the two resulting databases could be combined (so that you can interactively browse /large_dir/ all at once), that would be a really cool feature.
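If the store were keyed by full directory paths (as sketched above), a merge could mostly amount to copying keys, since entries from disjoint subtrees cannot collide; the only extra step is synthesizing a listing for the shared parent so the merged tree can be browsed from the top. A hedged Go sketch under that assumed layout (the `merge` helper and the comma-joined parent listing are illustrative only, not gdu's actual format):

```go
package main

import (
	"fmt"
	"path"
	"sort"
	"strings"
)

// kv stands in for an embedded key-value store such as BadgerDB;
// keys are absolute directory paths, values the serialized entry list.
type kv map[string][]byte

// merge copies every key from src into dst, then synthesizes a listing
// for the common parent from the direct children now present in dst.
// Full-path keys mean records from different subtrees cannot collide.
func merge(dst, src kv, parent string) {
	for k, v := range src {
		dst[k] = v
	}
	var children []string
	for k := range dst {
		if path.Dir(k) == parent {
			children = append(children, path.Base(k))
		}
	}
	sort.Strings(children)
	// A real implementation would encode proper entry records;
	// a comma-joined name list keeps this sketch self-contained.
	dst[parent] = []byte(strings.Join(children, ","))
}

func main() {
	a := kv{"/large_dir/subdir1": []byte("serialized entries")}
	b := kv{"/large_dir/subdir2": []byte("serialized entries")}
	merge(a, b, "/large_dir")
	fmt.Println(string(a["/large_dir"]))
}
```

Conflicts would only arise if both scans covered the same directory; last-write-wins (as above) or preferring the newer scan timestamp are the obvious resolutions.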
First implementation released in https://github.com/dundee/gdu/releases/tag/v5.26.0. Please try it out; all feedback is appreciated.