# Batch larger serialisations
## Current behavior
Node (V8) has an internal string length limit, so any serialized database larger than 2GB fails with a string length error when we run `JSON.parse` here: https://github.com/LokiJS-Forge/LokiDB/blob/master/packages/loki/src/loki.ts#L687
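The ceiling can be demonstrated without LokiDB at all. In a minimal sketch like the following (the exact maximum string length depends on the Node/V8 version, but on 64-bit builds it is roughly 2^30 characters, well under 2GB of data), the oversized string can't even be constructed, so `JSON.parse` never receives valid input:

```ts
try {
  // Asking V8 for a string longer than its maximum length throws
  // immediately -- the same failure mode LokiDB hits when the
  // serialized database exceeds the limit.
  const big = "x".repeat(2 ** 31); // RangeError: Invalid string length
  JSON.parse(big);                 // never reached
} catch (err) {
  console.error(err);              // RangeError: Invalid string length
}
```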
## Expected behavior
Data of any size is eventually serialized and deserialized without error.
## What is the motivation / use case for changing the behavior?
Allow batch inserts of quantities of data larger than 2GB.
## Environment

- LokiDB version: 2.1.0
- Node version: v14.17.0
- Others:
  - We could add an explicit batch size parameter, as Knex does with its `batchInsert` method: https://knexjs.org/
  - We could batch inserts internally with an arbitrary default that can be overridden in the DB config, e.g. batches of 10,000 (see the sketch below).
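A minimal sketch of what that second option could look like. The `batchInsert` helper, its default of 10,000, and the structural collection type are illustrative, not part of LokiDB's API; `Loki`, `addCollection`, and `insert` follow the LokiDB README:

```ts
import { Loki } from "@lokidb/loki";

const DEFAULT_BATCH_SIZE = 10_000;

// Insert documents in fixed-size slices so no single call has to
// materialize the entire payload at once.
function batchInsert<T extends object>(
  collection: { insert(docs: T[]): unknown },
  docs: T[],
  batchSize: number = DEFAULT_BATCH_SIZE
): void {
  for (let i = 0; i < docs.length; i += batchSize) {
    collection.insert(docs.slice(i, i + batchSize));
  }
}

// Usage:
const db = new Loki("example.db");
const users = db.addCollection("users");
batchInsert(
  users,
  Array.from({ length: 50_000 }, (_, i) => ({ name: `user-${i}` }))
);
```

Note that chunking inserts alone doesn't lift the serialization ceiling: the save/load path would also need to read and write the database in chunks rather than as one JSON string.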
---

Am I hitting this issue here: https://github.com/nuxt/content/issues/947? It would be great if that got fixed!
---

@lustremedia I attempted a fix, but string-based serialization is quite fundamental to how Loki works, so the change would be pretty significant. Note also that LokiDB is somewhat unmaintained: https://github.com/LokiJS-Forge/LokiDB/issues/190#issuecomment-901017586. I can recommend the successor I wrote, https://github.com/elmarti/camadb, which doesn't have this issue. The main caveat is the current lack of indexing (planned for the near future), although it performs pretty well without it.