
[BUG] CSV Streams severely limited in capacity

Open Komefumi opened this issue 3 years ago • 2 comments

Describe the bug

A CSV stream is severely limited in the amount of data it can hold, which becomes a problem when a large amount of string content needs to pass through it.

For instance, consider this piece of code running in a loop, meant to build up the full contents of a CSV file that might well total a few megabytes:

await csvStream.write(stringContent); // write() returns a boolean, not a Promise, so this await has no effect

The cut-off hits very early.
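
For context, a minimal self-contained sketch of that pattern; the row shape and row count are made up for illustration:

const csv = require('fast-csv');

const csvStream = csv.format({ headers: true });

// Write many rows in a loop with nothing consuming the readable side of
// the stream; as described above, the formatted output stops growing
// long before all the rows have been written.
for (let i = 0; i < 100000; i++) {
  csvStream.write({ id: i, value: `row number ${i}` });
}
csvStream.end();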

In these situations, the following has been shown to work:

let csvContent = '';
csvStream.on('data', (row) => {
  csvContent += row;
});

Having a limit on the CSV stream is understandable, but I couldn't find anything about it in the documentation. I think it would be sufficient to mention this caveat there and to suggest a workaround (such as the one above) for these circumstances.
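
For reference, a fuller, self-contained version of that workaround; the 'end' handler, the row data, and the out.csv file name are illustrative additions, not from the original report:

const fs = require('fs');
const csv = require('fast-csv');

const csvStream = csv.format({ headers: true });

// Attaching a 'data' listener switches the stream into flowing mode
// (standard Node.js stream behavior), so formatted chunks are consumed
// as they are produced instead of accumulating inside the stream.
let csvContent = '';
csvStream.on('data', (chunk) => {
  csvContent += chunk;
});
csvStream.on('end', () => {
  fs.writeFileSync('out.csv', csvContent);
});

for (let i = 0; i < 100000; i++) {
  csvStream.write({ id: i, value: `row number ${i}` });
}
csvStream.end();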

Komefumi avatar Nov 26 '22 02:11 Komefumi

@zxramozx @doug-martin @dustinsmith1024

I can reproduce this issue. It seems it is not possible to create CSV files bigger than 64 kilobytes.

Please take a look at this demo script to reproduce: https://github.com/pahund/fast-csv-bug
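
One plausible reading, given that attaching a 'data' listener works around it, is that the formatted output simply piles up in the stream's internal buffer until it is full. The following sketch would make such buffering visible; the row data and count are made up, and this is a guess at the mechanism, not a confirmed diagnosis:

const csv = require('fast-csv');

const csvStream = csv.format({ headers: true });

// Write far more data than the stream's internal buffer can hold, with
// nothing consuming the readable side.
for (let i = 0; i < 100000; i++) {
  csvStream.write({ id: i, value: 'x'.repeat(50) });
}

// If buffering is the cause, readableLength stalls near the readable
// high-water mark (16 KiB by default for byte streams), no matter how
// many rows were written above.
setImmediate(() => {
  console.log(csvStream.readableLength);
});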

pahund avatar Aug 09 '24 07:08 pahund