postgres-new

Deploy to Supabase fails with large CSVs: "memory access out of bounds"

Open danrasmuson opened this issue 10 months ago • 2 comments

Bug report

Describe the bug

For large CSVs, I'm getting a "memory access out of bounds" error when deploying the database to Supabase.

To Reproduce

  1. Upload the SAM.gov ContractOpporunitiesFullCSV file (207 MB): https://drive.google.com/file/d/1_DaYluzfR4t_n4HkigHLqhlGTQ1Wy7_K/view?usp=sharing
  2. Click "Deploy to Supabase"
  3. Confirm the "memory access out of bounds" error appears

Expected behavior

Expected the database to deploy to Supabase successfully.

Screenshots

(screenshot of the error)

System information

  • OS: macOS
  • Browser (if applicable): Chrome

danrasmuson avatar Mar 18 '25 17:03 danrasmuson

Hey @danrasmuson, thanks for reporting. The deploy logic does a pg_dump under the hood, so my best guess is that pg_dump executes a query whose response is too large for PGlite's WASM memory to handle.

This might not be a trivial problem to solve. Could you try manually running a dump on the browser DB (top right -> Download), and let me know if this works? If so, you could manually restore this dump as an interim solution.
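(Editor's note: for reference, manually restoring a downloaded dump to a Supabase project can be done with `psql`. The connection string below is a placeholder, not one from this thread; the real one comes from the project's dashboard under Database settings.)

```shell
# Restore a locally downloaded SQL dump into a Supabase Postgres instance.
# Replace the placeholder password and project ref with your own values.
psql "postgresql://postgres:[YOUR-PASSWORD]@db.[YOUR-PROJECT-REF].supabase.co:5432/postgres" \
  -f dump.sql
```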

gregnr avatar Mar 20 '25 21:03 gregnr

Hi @gregnr,

When I do top right -> Download with this database, I get this error.

(screenshot of the error)

Is it possible to stream or chunk the pg_dump process?
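The chunking idea could be sketched roughly like this. This is a hypothetical illustration, not the project's actual deploy code: `QueryFn` stands in for PGlite's query method, and the table/chunk-size choices are assumptions. The point is to page through results in batches and stream each batch out, rather than materializing the entire dump in WASM memory at once.

```typescript
// A row is just a generic record; QueryFn is a stand-in (assumed, not the
// real PGlite API) for "run a paged SELECT and return the rows".
type Row = Record<string, unknown>
type QueryFn = (limit: number, offset: number) => Promise<Row[]>

// Yields the table's rows in fixed-size batches. Each batch can be written
// to the deploy target as it is produced, so peak memory is one batch,
// not the whole result set.
async function* exportInChunks(
  runQuery: QueryFn,
  chunkSize = 10_000
): AsyncGenerator<Row[]> {
  let offset = 0
  for (;;) {
    const rows = await runQuery(chunkSize, offset)
    if (rows.length === 0) break // no more rows: done
    yield rows
    offset += rows.length
  }
}
```

In a real implementation the paging would need a stable `ORDER BY` (and ideally keyset pagination instead of `OFFSET`) so chunks don't overlap or skip rows while the export runs.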

danrasmuson avatar Mar 21 '25 20:03 danrasmuson