node-zip

process out of memory

SamDecrock opened this issue 8 years ago · 3 comments

Hi,

I'm processing some big files and I'm getting a "process out of memory" error.

```
generating zip
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory

<--- Last few GCs --->

  166977 ms: Mark-sweep 181.8 (349.4) -> 164.6 (338.4) MB, 39.8 / 0 ms (+ 110.6 ms in 280 steps since start of marking, biggest step 9.0 ms) [allocation failure] [GC in old space requested].
  167072 ms: Mark-sweep 164.6 (442.5) -> 130.6 (382.5) MB, 95.1 / 0 ms [allocation failure] [GC in old space requested].
  167153 ms: Mark-sweep 130.6 (486.7) -> 129.8 (483.7) MB, 80.8 / 0 ms [last resort gc].
  167214 ms: Mark-sweep 129.8 (483.7) -> 128.6 (483.7) MB, 61.0 / 0 ms [last resort gc].

<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x3c5bdb2b4629 <JS Object>
    1: Join(aka Join) [native array.js:133] [pc=0x167ded90d3a7] (this=0x3c5bdb2041b9 <undefined>,o=0x241786009bf9 <JS Array[250]>,v=250,C=0x3c5bdb204291 <String[0]: >,B=0x3c5bdb295289 <JS Function ConvertToString (SharedFunctionInfo 0x3c5bdb24a029)>)
    2: InnerArrayJoin(aka InnerArrayJoin) [native array.js:331] [pc=0x167dedd83f6a] (this=0x3c5bdb2041b9 <undefined>,C=0x3c5bdb204291 <String[0]: ...

error: Forever detected script was killed by signal: SIGABRT
```

Obviously, I'm out of memory because I'm trying to zip too much :-) Can the module be told to write directly to disk (stream)?

Thanks, Sam

SamDecrock avatar Sep 22 '17 05:09 SamDecrock

@SamDecrock Did you ever manage to find a memory-efficient way to write zip files in Node?

vladbalan avatar Nov 16 '18 09:11 vladbalan

@vladbalan Yes!

I'm currently using the archiver module. It lets me stream the archive directly to disk, so only disk space is used, not RAM.

Sam

SamDecrock avatar Nov 16 '18 15:11 SamDecrock

@SamDecrock Thanks! I was actually testing archiver just now. I compressed 100 files of 10 MB each and saw no memory usage spikes! Good stuff. :)

vladbalan avatar Nov 16 '18 16:11 vladbalan