Feature/lazy blob reading
Wow, working inside jszip felt old-school. While trying to implement another feature, I noticed that files were read as a Uint8Array immediately, all at once, which is really bad. I moved that part into the DataWorker.
This would possibly fix issues like #343 and #530.
I really think you should consider dropping support for older platforms; Babel would have been something!
Anyhow, what this PR does is: it just passes blobs/files along in prepareContent instead of reading all of them into a buffer up front, and then lets DataWorker.js read the content one piece at a time, as it's needed.
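For anyone curious what "one piece at a time" means in practice, here is a minimal sketch of lazy chunked reading from a Blob. The names and chunk size are illustrative, not jszip's actual internals: the point is that `Blob.prototype.slice` is cheap, and only the currently requested chunk is ever materialized in memory.

```javascript
// Illustrative sketch (not jszip's real code): read a Blob lazily, one
// chunk at a time, instead of buffering the whole file as a Uint8Array.
const CHUNK_SIZE = 16 * 1024; // hypothetical chunk size

async function* readBlobInChunks(blob, chunkSize = CHUNK_SIZE) {
  for (let offset = 0; offset < blob.size; offset += chunkSize) {
    // slice() is lazy: no bytes are read until arrayBuffer() is awaited,
    // so at most one chunk is held in memory at a time.
    const slice = blob.slice(offset, offset + chunkSize);
    yield new Uint8Array(await slice.arrayBuffer());
  }
}
```

A consumer (like a data worker) can then pull chunks on demand, e.g. `for await (const chunk of readBlobInChunks(file)) { compress(chunk); }`, which is what keeps peak memory flat even for multi-GiB files.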
I tested this in the context of my application. Compressing a single large file (512 MiB, 1 GiB, 2 GiB, and 4 GiB) worked as expected. (There is no "multiple files per zip" case in my context.)
I really would love to see this being merged.
Has this been merged now? I was under the impression that the workers were working in chunks?
Nope, but if you like you can try out a lightweight version (< 200 lines) of my zipping utility: demo (view source to see how to use it). It is based on ReadableStreams and doesn't use nearly as much memory as jszip does; I'd guess it's a bit faster, too.
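The memory win from a ReadableStream-based design comes from pulling one chunk at a time rather than buffering the file. A hedged sketch (illustrative helper, not the utility's actual API) of consuming a stream that way:

```javascript
// Illustrative sketch: consume a ReadableStream chunk by chunk, so only
// the current chunk is resident in memory (e.g. to hash, compress, or
// count bytes). Not the actual API of the zipping utility mentioned above.
async function totalByteLength(readableStream) {
  const reader = readableStream.getReader();
  let total = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) return total;
    total += value.byteLength; // process this chunk, then let it be GC'd
  }
}
```

With `File`/`Blob`, `blob.stream()` yields such a ReadableStream, so a zipper built on it never needs the whole input in memory at once.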
Another zip library I'm working on is https://github.com/transcend-io/conflux. It's also based on WHATWG streams, but is a more complete zip reader/writer. Benchmark: https://conflux.netlify.com/benchmark
Hey @jimmywarting, this benchmark link doesn't work! Do you have the correct URL?
Ping @Stuk, can you take a look at this?