Async PG Fix v2
@levkk did all the heavy lifting; I just merged with main and tested it out with both Python and Node async clients. Also removed a redundant variable and method.
This can fix #396.
It works for me. Can you fix the tests?
Not sure if it's a problem in the code, but I'm getting this in the test runs; the actual tests didn't even start running.
I am trying this with the example script from pg-query-stream: https://www.npmjs.com/package/pg-query-stream
const pg = require('pg')
const pool = new pg.Pool()
const QueryStream = require('pg-query-stream')
const JSONStream = require('JSONStream')
// pipe 1,000,000 rows to stdout without blowing up your memory usage
pool.connect((err, client, done) => {
  if (err) throw err
  const query = new QueryStream('SELECT * FROM generate_series(0, $1) num', [1000000])
  const stream = client.query(query)
  // release the client when the stream is finished
  stream.on('end', done)
  stream.pipe(JSONStream.stringify()).pipe(process.stdout)
})
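For reference, new pg.Pool() with no arguments reads the standard libpq-style PG* environment variables (PGHOST, PGPORT, PGUSER, PGPASSWORD, PGDATABASE), so pointing the script at Postgres or at PgCat is just a matter of changing those. An explicit equivalent, sketched with placeholder values (PgCat is often configured on 6432, but check pgcat.toml), would look something like this:

const pg = require('pg')

// Explicit connection settings instead of PG* environment variables.
// Host, port and credentials below are placeholders, not values from this report.
const pool = new pg.Pool({
  host: '127.0.0.1',
  port: 6432, // a common PgCat listen port; use 5432 to go straight to Postgres
  user: 'postgres',
  password: 'postgres',
  database: 'postgres',
})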
If I connect directly to Postgres, I see output immediately, as expected (ignore the error, that's just because there is no error handler in the sample script):
$ time node ./ | head
[
{"num":0}
,
{"num":1}
,
{"num":2}
,
{"num":3}
,
{"num":4}
node:events:492
throw er; // Unhandled 'error' event
^
Error: write EPIPE
at afterWriteDispatched (node:internal/stream_base_commons:160:15)
at writeGeneric (node:internal/stream_base_commons:151:3)
at Socket._writeGeneric (node:net:952:11)
at Socket._write (node:net:964:8)
at writeOrBuffer (node:internal/streams/writable:447:12)
at _write (node:internal/streams/writable:389:10)
at Writable.write (node:internal/streams/writable:393:10)
at Stream.ondata (node:internal/streams/legacy:20:31)
at Stream.emit (node:events:514:28)
at drain (/Users/smcgivern/Code/forks/pgcat/tests/node/node_modules/through/index.js:36:16)
Emitted 'error' event on Socket instance at:
at Socket.onerror (node:internal/streams/legacy:62:12)
at Socket.emit (node:events:514:28)
at emitErrorNT (node:internal/streams/destroy:151:8)
at emitErrorCloseNT (node:internal/streams/destroy:116:3)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
errno: -32,
code: 'EPIPE',
syscall: 'write'
}
Node.js v20.9.0
real 0m0.209s
user 0m0.040s
sys 0m0.020s
If I connect to PgCat running this commit (having merged in main), it simply never terminates or logs any items, even with a single row. Given that this works fine with PgBouncer, I suspect there's something not quite right here.
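As an aside, the EPIPE in the trace above is just head closing the pipe after ten rows. Purely as an illustration (these lines are not part of the original sample), it could be swallowed by adding error handlers inside the pool.connect callback:

// Illustrative only: swallow the EPIPE that head triggers by closing stdout
// early, and release the pooled client if the query stream itself errors.
process.stdout.on('error', (err) => {
  if (err.code === 'EPIPE') process.exit(0)
  throw err
})
stream.on('error', (err) => {
  done(err)
  console.error(err)
})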
It works for me somewhat in session mode instead of transaction mode. However, if I run a synchronous query after performing async queries, I get DEALLOCATE ALL instead of my database rows, and afterwards the response randomly switches between that and SET/RESET. As far as log output goes, I only see this:
2024-08-15T18:29:40.127180895Z 2024-08-15T18:29:40.126615Z INFO ThreadId(05) pgcat::server::cleanup: Server returned with session state altered, discarding state (SET: true, PREPARE: true) for application pgcat
2024-08-15T18:29:52.105634825Z 2024-08-15T18:29:52.105240Z INFO ThreadId(07) pgcat::server::cleanup: Server returned with session state altered, discarding state (SET: true, PREPARE: false) for application pgcat
2024-08-15T18:29:56.960420756Z 2024-08-15T18:29:56.959998Z INFO ThreadId(07) pgcat::server::cleanup: Server returned with session state altered, discarding state (SET: true, PREPARE: false) for application pgcat
2024-08-15T18:30:19.725779443Z 2024-08-15T18:30:19.725516Z INFO ThreadId(07) pgcat::server::cleanup: Server returned with session state altered, discarding state (SET: true, PREPARE: false) for application pgcat
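For context, a rough sketch of the kind of sequence described above, assuming node-postgres and PgCat on a placeholder port (queries and connection details are illustrative, not the actual workload):

const pg = require('pg')

async function main () {
  // Placeholder connection settings; 6432 is a common PgCat listen port.
  const pool = new pg.Pool({ host: '127.0.0.1', port: 6432 })
  const client = await pool.connect()
  try {
    // "Async" queries: issue several back to back without awaiting each one,
    // so node-postgres queues them on the same connection.
    const pending = [
      client.query('SELECT pg_sleep(0.1)'),
      client.query('SELECT pg_sleep(0.1)'),
      client.query('SELECT pg_sleep(0.1)'),
    ]
    await Promise.all(pending)

    // "Synchronous" query afterwards: this is where the report sees
    // DEALLOCATE ALL or SET/RESET responses instead of the actual rows.
    const res = await client.query('SELECT 1 AS one')
    console.log(res.command, res.rows)
  } finally {
    client.release()
    await pool.end()
  }
}

main().catch((err) => { console.error(err); process.exit(1) })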