Daniel Durante
I haven't tried it in a long time / moved on to other graph databases that fit our needs. I no longer have the CSV files, but the lengths,...
@crephix what's the generated model that's throwing this error?
Can you re-submit without the vendor changes please?
@mwoss any luck on this issue?
:+1: was looking for this myself today (using go-bindata)
For the most part I've moved on to the golang/rust world, but I can definitely update this repo... really the only major difference is having a `tasks` folder and setting...
@kmansoft submitted a PR that should fix your issue. In the meantime, my current workaround was to acquire a connection from the pool manually for each query and then release it manually.
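A minimal sketch of that acquire/release pattern with pgxpool (assuming pgx v4; the DSN and the query are placeholders, not what I actually ran):

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/jackc/pgx/v4/pgxpool"
)

func main() {
	ctx := context.Background()

	// Placeholder DSN for illustration only.
	pool, err := pgxpool.Connect(ctx, "postgres://user:pass@localhost:5432/mydb")
	if err != nil {
		log.Fatal(err)
	}
	defer pool.Close()

	// Acquire a dedicated connection from the pool for this query,
	// then release it back explicitly instead of going through pool.Query.
	conn, err := pool.Acquire(ctx)
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Release()

	var n int
	if err := conn.QueryRow(ctx, "SELECT 1").Scan(&n); err != nil {
		log.Fatal(err)
	}
	fmt.Println(n)
}
```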
I was using pgxpool with Query and QueryRow and still hitting the same problem as you (I also noted the 10 vs 20 goroutines difference on my own, so our experience...
@kmansoft my current working solution is to drop down to pgx's `stdlib` and use `stdlib.AcquireConn(*sql.DB)` within goroutines (then defer close that connection). So far this has been working well in...
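Roughly what that looks like (a sketch assuming pgx v4's `stdlib` package; the DSN, query, and goroutine count are placeholders, and I'm pairing `AcquireConn` with `ReleaseConn` to return the connection):

```go
package main

import (
	"context"
	"database/sql"
	"log"
	"sync"

	"github.com/jackc/pgx/v4/stdlib"
)

func main() {
	ctx := context.Background()

	// database/sql pool backed by the pgx stdlib driver; placeholder DSN.
	db, err := sql.Open("pgx", "postgres://user:pass@localhost:5432/mydb")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	var wg sync.WaitGroup
	for i := 0; i < 10; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()

			// Pin a raw *pgx.Conn from the database/sql pool for this goroutine,
			// and hand it back when the goroutine finishes.
			conn, err := stdlib.AcquireConn(db)
			if err != nil {
				log.Println(err)
				return
			}
			defer stdlib.ReleaseConn(db, conn)

			var n int
			if err := conn.QueryRow(ctx, "SELECT 1").Scan(&n); err != nil {
				log.Println(err)
			}
		}()
	}
	wg.Wait()
}
```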
For the record, I used max pool settings this entire time (ranging from 20 to 400). Checked for long-running queries [> 5 minutes] (there were none).

```
The underlying query started...
```
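For reference, this is roughly how the max pool size was set (a sketch with pgx v4's pgxpool; the DSN and the `MaxConns` value are placeholders, not the actual settings I used):

```go
package main

import (
	"context"
	"log"

	"github.com/jackc/pgx/v4/pgxpool"
)

func main() {
	// Placeholder DSN; MaxConns was varied between 20 and 400 while testing.
	cfg, err := pgxpool.ParseConfig("postgres://user:pass@localhost:5432/mydb")
	if err != nil {
		log.Fatal(err)
	}
	cfg.MaxConns = 20

	pool, err := pgxpool.ConnectConfig(context.Background(), cfg)
	if err != nil {
		log.Fatal(err)
	}
	defer pool.Close()
}
```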