# Deploy to Supabase fails with large CSVs: "memory access out of bounds"
Bug report
### Describe the bug

For large CSVs I'm getting a "memory access out of bounds" error when deploying the database to Supabase.
### To Reproduce

- Upload the SAM.gov ContractOpportunitiesFullCSV file (207 MB): https://drive.google.com/file/d/1_DaYluzfR4t_n4HkigHLqhlGTQ1Wy7_K/view?usp=sharing
- Click "Deploy to Supabase"
- Confirm you receive the memory error
### Expected behavior

The database deploys to Supabase successfully.
### System information

- OS: macOS
- Browser (if applicable): Chrome
Hey @danrasmuson, thanks for reporting. The deploy logic does a pg_dump under the hood, so my best guess is that pg_dump executes a query whose response is too large for PGlite's WASM memory to handle.
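To illustrate the failure mode, here's a simplified sketch (not our exact deploy code; the `pgDump` helper from `@electric-sql/pglite-tools` stands in for whatever the deploy path actually calls):

```ts
import { PGlite } from '@electric-sql/pglite'
import { pgDump } from '@electric-sql/pglite-tools/pg_dump'

// pg_dump runs inside the same WASM sandbox as PGlite, so the dump it
// assembles (plus its working memory) has to fit in the 32-bit WASM heap
// alongside the database itself.
const pg = await PGlite.create()

// ... load the 207 MB CSV into a table here ...

// With a large enough database, this is where the heap runs out and the
// runtime throws "memory access out of bounds".
const dump = await pgDump({ pg })
```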
This might not be a trivial problem to solve. Could you try manually running a dump on the browser DB (top right -> Download) and let me know if that works? If so, you could manually restore this dump as an interim solution.
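For reference, the downloaded dump should be plain SQL, so you should be able to restore it into your Supabase project with something like `psql "<your Supabase connection string>" -f dump.sql` (connection string from the project dashboard).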
Hi @gregnr,
When I do top right -> Download with this database, I get the same error.
Is it possible to stream or chunk the pg_dump process?
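To make that concrete, here's a rough sketch of the kind of chunking I mean (the function name, batch size, and COPY-per-table approach are all mine, not a claim about how the deploy currently works). Each batch is produced via PGlite's `/dev/blob` COPY device, so no single result set has to fit in the WASM heap:

```ts
import { PGlite } from '@electric-sql/pglite'

// Hypothetical chunked export: emit one table's rows as a series of small
// CSV blobs instead of one monolithic pg_dump payload.
async function* dumpTableInChunks(
  pg: PGlite,
  table: string, // assumed to be a trusted, already-quoted identifier
  batchSize = 50_000,
): AsyncGenerator<Blob> {
  let offset = 0
  for (;;) {
    // LIMIT/OFFSET keeps each COPY result small; a keyset cursor on the
    // primary key would scale better, but this shows the idea.
    const ret = await pg.query(
      `COPY (SELECT * FROM ${table} ORDER BY 1 LIMIT ${batchSize} OFFSET ${offset})
         TO '/dev/blob' WITH (FORMAT csv)`,
    )
    const chunk = ret.blob // PGlite exposes COPY output on its /dev/blob device
    if (!chunk || chunk.size === 0) return
    yield chunk
    offset += batchSize
  }
}
```

The schema would still need to be dumped separately (it's small), and each chunk could then be applied on the Supabase side with `COPY ... FROM STDIN WITH (FORMAT csv)`, so neither end ever holds the full dataset in memory.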