
Prolonged high CPU usage when the net tables grow too big

Open steve-chavez opened this issue 1 year ago • 3 comments

Problem

There was a case where a user imported data, causing millions of rows to be inserted into a table that had webhooks enabled, so each insert queued an HTTP request in the net tables. This caused high CPU usage for a prolonged period of time.

Proposal

With an in-memory queue, it's possible to bound the queue to a certain size. Once that size is exceeded, we could:

  • Block the producers of HTTP requests until there's more capacity in the queue (see the sketch below).
  • Log an ERROR or WARNING.

Note

Spilling over to disk (i.e., inserting into another table) is not an option, as it would make the usage more complex.
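A minimal sketch of the bounded-queue idea, in Python for brevity (pg_net's actual queue would live in its C background worker); the size limit and function names here are made up. A stdlib queue with `maxsize` already gives the "block the producer when full" behavior from the first bullet:

```python
import logging
import queue

MAX_QUEUE_SIZE = 4096  # hypothetical bound; pg_net would pick its own limit

# put() on a bounded queue blocks while the queue is full, which is the
# back-pressure behavior proposed above.
request_queue: "queue.Queue[str]" = queue.Queue(maxsize=MAX_QUEUE_SIZE)

def enqueue_request(url: str) -> None:
    """Producer side: warn when the bound is hit, then block until there is room."""
    if request_queue.full():
        logging.warning("request queue full (%d items); blocking producer",
                        request_queue.qsize())
    request_queue.put(url)  # blocks while the queue is at maxsize

def drain_one() -> str:
    """Consumer side: the worker draining requests frees capacity for producers."""
    url = request_queue.get()
    # ... the real worker would perform the HTTP request here ...
    request_queue.task_done()
    return url
```

Whether the producer waits (back-pressure) or fails with an ERROR/WARNING once the bound is hit is the open design choice; the shape of the queue is the same either way.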

steve-chavez avatar Nov 06 '24 01:11 steve-chavez

I'm struggling with this exact case

riderx avatar Nov 08 '24 01:11 riderx

@riderx How many rows do you have in the net tables? I've seen this case before but I'd like to gather more data.

steve-chavez avatar Nov 08 '24 23:11 steve-chavez

@steve-chavez Hello. My net table once had 100,000 rows piled up, which forced me to divert some network requests and call them through my own service (e.g. http1234 --> pg_net, http5678 --> my_pg_net). I have a separate service that reads my_pg_net and sends the HTTP requests.
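A rough sketch of the pattern described above, i.e. a "shadow" queue table drained by an external dispatcher instead of pg_net; the table name, its columns, the connection string, and the batch size are all hypothetical:

```python
import time

import psycopg2
import requests

DSN = "postgresql://postgres:postgres@localhost:5432/postgres"  # placeholder

def drain_once(conn) -> int:
    """Send and delete one batch of queued requests; returns the batch size."""
    with conn.cursor() as cur:
        cur.execute(
            """
            DELETE FROM my_pg_net
            WHERE id IN (SELECT id FROM my_pg_net ORDER BY id LIMIT 100)
            RETURNING id, url, body
            """
        )
        rows = cur.fetchall()
    for _id, url, body in rows:
        try:
            requests.post(url, json=body, timeout=5)
        except requests.RequestException:
            pass  # a real dispatcher would retry or re-queue the failed row
    conn.commit()
    return len(rows)

def main() -> None:
    conn = psycopg2.connect(DSN)
    while True:
        if drain_once(conn) == 0:
            time.sleep(1)  # queue empty; back off before polling again

if __name__ == "__main__":
    main()
```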

begank avatar Nov 18 '24 06:11 begank