eawer
@SGrondin I tried your suggestion and unfortunately it leads to the same issue: `FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory`
@SGrondin, unfortunately, it is still there. I've created a repo to reproduce the issue: https://github.com/eawer/bottleneck-oom-test. It fails on my machine (Ubuntu 16.04, i7, 16 GB RAM).
@zhreshold yes, all of `nvcr.io/nvidia/pytorch:20.07-py3`, `nvcr.io/nvidia/pytorch:20.03-py3` and `nvcr.io/nvidia/mxnet:20.07-py3` failed. The output I posted is for `nvcr.io/nvidia/pytorch:20.07-py3`
If you mean the version of Docker itself, it's `19.03.8, build afacb8b7f0` and `19.03.12, build 48a66213fe`.
I tried to load data using the latest master:

```
Cayley version: v0.7.x-dev
Git commit hash: 300dcf1f8503
```

(I believe the commit you've referenced is already in there), and probably...
On another try I received this:

```
fatal error: runtime: out of memory

runtime stack:
runtime.throw(0x150ab5f, 0x16)
	/usr/local/go/src/runtime/panic.go:608 +0x72
runtime.sysMap(0xc68c000000, 0x3c000000, 0x22e1738)
	/usr/local/go/src/runtime/mem_linux.go:156 +0xc7
runtime.(*mheap).sysAlloc(0x22c0640, 0x3c000000, 0x7f16c9bdaf80, 0x7f16a3013bc0)
	/usr/local/go/src/runtime/malloc.go:619 +0x1c7
runtime.(*mheap).grow(0x22c0640,...
```
Badger failed with the following error:

```
I0821 08:20:19.862643 1 cayley.go:63] Cayley version: v0.7.x-dev (300dcf1f8503)
I0821 08:20:19.862868 1 cayley.go:76] using config file: etc/cayley.json
I0821 08:20:19.862959 1 database.go:193] using backend "badger" (/data/badger)...
```
Reducing the batch size to 10k for bolt did not help much: the process ended with no trace because it was OOM-killed, leaving something like https://github.com/cayleygraph/cayley/issues/815#issuecomment-523063489 in the journal.
Badger only started importing with a 1k batch size, but eventually crashed like this (long trace):

```
I0821 13:46:05.629259 1 load.go:149] Wrote 91399000 quads.
fatal error: runtime: out of...
```
`Cayley version: v0.7.x-dev (88cbd1564ad0)`: Badger died after 20 minutes and 14M quads (batch size 1k) with `Error: db: failed to load data: Txn is too big to fit into one request`...
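For what it's worth, Badger returns this error (`ErrTxnTooBig`) when a single transaction exceeds its internal size limit, and the usual workaround is to commit the full transaction and retry the write in a fresh one rather than aborting the whole load. A self-contained sketch of that commit-and-retry pattern (the `txn` type here is a toy stand-in for a Badger transaction, not Cayley's actual loader code):

```go
package main

import (
	"errors"
	"fmt"
)

// errTxnTooBig stands in for badger.ErrTxnTooBig.
var errTxnTooBig = errors.New("txn too big to fit into one request")

// txn is a toy transaction with a hard cap on pending writes,
// mimicking Badger's per-transaction size limit.
type txn struct {
	ops, cap int
}

func (t *txn) set() error {
	if t.ops >= t.cap {
		return errTxnTooBig
	}
	t.ops++
	return nil
}

func (t *txn) commit() { t.ops = 0 }

// loadQuads writes n items, committing and retrying whenever the
// transaction reports it is full, instead of failing the whole load.
// It returns the number of commits performed.
func loadQuads(n, txnCap int) int {
	t := &txn{cap: txnCap}
	commits := 0
	for i := 0; i < n; i++ {
		if err := t.set(); errors.Is(err, errTxnTooBig) {
			t.commit() // flush what fit so far
			commits++
			t.set() // retry the write in the fresh transaction
		}
	}
	if t.ops > 0 { // flush the final partial batch
		t.commit()
		commits++
	}
	return commits
}

func main() {
	fmt.Println(loadQuads(25, 10)) // 25 writes, cap 10 -> 3 commits
}
```

If the loader applied this pattern, the batch size would no longer need manual tuning to stay under Badger's transaction limit.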