
FEATURE: Add NodeJS Streams support to subscriptions

JesseDocken opened this issue 1 year ago

NodeJS has the concept of a Stream (e.g. Readable and Writable), which allows data to be processed efficiently and supports transformations, multiplexing, and backpressure. This would be useful for easily writing queue subscriptions that, for example, read messages, apply some transformations, and then publish them to a separate queue or write them to a database.

Detailed Description

I would propose adding a broker method, subscribeStream(key), that returns an object-mode Readable whose chunks are objects of the form:

{
  message: Message,
  content: string | Buffer | any,
  ackOrNack: AckOrNack
}

This stream could then be piped to downstream consumers for further processing and eventually to a terminator that calls ackOrNack() on success. Each stage would need an error handler that calls ackOrNack() with the appropriate error and recovery strategy, but otherwise usage would be identical to existing consumers of the subscribe() API.
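
To make the intent concrete, here is a rough, hypothetical usage sketch. subscribeStream does not exist yet, 'demo_subscription' is an assumed subscription name, and saveSomewhere is just a placeholder for whatever side effect the terminator performs:

import { Transform, Writable } from 'stream';
import { pipeline } from 'stream/promises';

// Hypothetical usage of the proposed subscribeStream(key) API.
// `broker` is assumed to be an initialised Rascal broker and
// 'demo_subscription' a subscription defined in its config.
async function run(broker: any): Promise<void> {
  const source = await broker.subscribeStream('demo_subscription'); // proposed API

  // Transform each chunk, forwarding ackOrNack so the terminator can settle the message.
  const enrich = new Transform({
    objectMode: true,
    transform(chunk, _encoding, callback) {
      const { message, content, ackOrNack } = chunk;
      callback(null, { message, content: { ...content, processedAt: Date.now() }, ackOrNack });
    },
  });

  // Terminator: persist the content, then ack, or nack with a recovery strategy on failure.
  const sink = new Writable({
    objectMode: true,
    write(chunk, _encoding, callback) {
      saveSomewhere(chunk.content)
        .then(() => chunk.ackOrNack())
        .catch((err) => chunk.ackOrNack(err, { strategy: 'nack', requeue: true }))
        .finally(() => callback());
    },
  });

  await pipeline(source, enrich, sink);
}

// Placeholder for the real side effect (database insert, publish to another queue, etc.).
async function saveSomewhere(_content: unknown): Promise<void> {}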

Context

We're piping data from RabbitMQ into our ElasticSearch cluster after some post-processing to make it more indexable. The ElasticSearch client can read from a stream when doing bulk inserts, which would simplify a lot of our processing; overall, the program's architecture would be more straightforward if we could model it as a series of data streams, from the subscription all the way to the database.
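
For illustration, this is roughly how that could look with the Elasticsearch client's bulk helper, which accepts a readable stream or async iterable as its datasource. It again assumes the proposed subscribeStream API, and the index name and ack timing are simplified:

import { Client } from '@elastic/elasticsearch';
import { Transform } from 'stream';

// Sketch of our use case, assuming the proposed subscribeStream API.
// Acking inside the transform is a simplification; a real pipeline would ack
// per bulk batch or via the helper's onDrop callback.
async function indexFromQueue(broker: any, client: Client): Promise<void> {
  const source = await broker.subscribeStream('documents'); // proposed API

  // Post-process each message into an indexable document.
  const toDocument = new Transform({
    objectMode: true,
    transform(chunk, _encoding, callback) {
      chunk.ackOrNack();
      callback(null, { ...chunk.content, indexedAt: new Date().toISOString() });
    },
  });

  await client.helpers.bulk({
    datasource: source.pipe(toDocument),
    onDocument: () => ({ index: { _index: 'my-index' } }),
  });
}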

I'm certain this would also benefit other users whose downstream APIs support ingesting data from streams, since streams are a major NodeJS feature and fairly prevalent in backend servers, which this library generally targets.

Possible Implementation

I'm not familiar enough with the internals of Rascal to suggest one, unfortunately.
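
For illustration only, a consumer-side approximation can be built on the public subscribe() API of the promise-based broker; a proper implementation inside Rascal would presumably also handle cancellation and tie backpressure to the channel prefetch:

import { Readable } from 'stream';

// Consumer-side approximation, not a proposal for Rascal internals.
// Assumes the promise-based broker; there is no real backpressure here,
// which is one reason first-class support in Rascal would be preferable.
export async function subscribeStream(broker: any, key: string): Promise<Readable> {
  const stream = new Readable({ objectMode: true, read() {} });
  const subscription = await broker.subscribe(key);

  subscription
    .on('message', (message: unknown, content: unknown, ackOrNack: unknown) => {
      stream.push({ message, content, ackOrNack });
    })
    .on('error', (err: Error) => stream.destroy(err));

  // A full implementation would also end the stream when the subscription is
  // cancelled and surface invalid_content / redeliveries_exceeded events.
  return stream;
}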

JesseDocken · Apr 29 '24 22:04