
Use 'spawn delay' to gradually reach indicated concurrency level

holtkamp opened this issue 7 years ago • 3 comments

Currently, when assigning 1000 tasks to a Queue with a concurrency of 50, the first 50 items in the Queue become "pending" almost at once. Would it be possible to add a delay between starting these items, for example 0.1 seconds between each start, until the concurrency level has been reached?

The idea is: eventual concurrency of 50 is ok, but the work should be started gradually instead of "all at once until maximum concurrency is reached".
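Such a gradual ramp-up is not part of reactphp-mq; as a language-agnostic illustration of the proposal, here is a minimal Python sketch (the function name and parameters are hypothetical) of how the initial start times would be spaced:

```python
def ramp_up_start_times(num_tasks, concurrency, spawn_delay):
    """Start times (seconds, relative to the first start) of the initially
    spawned tasks when consecutive starts are spaced `spawn_delay` seconds
    apart instead of all beginning at once."""
    return [i * spawn_delay for i in range(min(num_tasks, concurrency))]

# With 1000 tasks, concurrency 50 and a 0.1s spawn delay, the last of the
# 50 initial slots starts 4.9s after the first one.
starts = ramp_up_start_times(1000, 50, 0.1)
```

With this scheme the queue still reaches its steady-state concurrency of 50; only the initial burst is smoothed out.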

holtkamp avatar May 31 '18 14:05 holtkamp

Thank you for bringing this up! The decision to start processing immediately is indeed by design, and I think this makes sense for a large number of use cases.

That being said, I agree that there are a number of use cases where you want to throttle operations with some kind of delay between each start. I've already toyed with the idea, and I think that a https://en.wikipedia.org/wiki/Token_bucket implementation would be a nice addition to this library :+1:
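For readers unfamiliar with the algorithm: a token bucket refills at a fixed rate and each operation consumes one token, which limits the average start rate while still allowing bursts. A minimal sketch of the idea (Python for brevity; this is not reactphp-mq API, and the injectable `clock` exists only to make the behaviour deterministic):

```python
import time

class TokenBucket:
    """Minimal token bucket: allows `rate` starts per second on average,
    with bursts of up to `capacity` starts."""

    def __init__(self, rate, capacity, clock=time.monotonic):
        self.rate = float(rate)          # tokens added per second
        self.capacity = float(capacity)  # maximum burst size
        self.tokens = float(capacity)    # start full: an initial burst is allowed
        self.clock = clock
        self.last = clock()

    def allow(self):
        """Consume one token if available; return True if the caller may
        start another operation now."""
        now = self.clock()
        # refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

A caller would poll `allow()` (or schedule a retry) before dispatching each queued operation, so starts average out to `rate` per second.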

I'm not currently working on this, but any input is appreciated. If you want to help with developing or sponsoring development, reach out and I'm happy to help! :shipit:

clue avatar Jun 02 '18 18:06 clue

Thanks for answering. The https://en.wikipedia.org/wiki/Token_bucket approach is new to me, I will read up on it. The article refers to https://en.wikipedia.org/wiki/Leaky_bucket#As_a_meter, which I read a few months ago, interesting stuff!

If I understand correctly this library follows the "Leaky bucket as a Queue" approach, right?

I think for now a reasonable "concurrency" setting suffices for most use cases. At least this aspect is now documented here. Maybe someone will need it in the future.

holtkamp avatar Jun 02 '18 20:06 holtkamp

> If I understand correctly this library follows the "Leaky bucket as a Queue" approach, right?

Correct :+1: The main difference is that the leaky bucket allows you to limit how many things happen at one time (limits concurrency). The token bucket allows you to control how many things are allowed to be started within a certain period of time (limits average rate with optional bursts).

Of course, both can be combined to express requirements like "start no more than 10 requests per 60s and run no more than 2 requests concurrently".
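That combination can be sketched with a sliding-window count of recent starts plus a running-task counter. Again a hypothetical Python illustration, not part of this library (a production version would use the event loop rather than polling):

```python
class CombinedLimiter:
    """Combine a concurrency cap with a sliding-window rate limit, e.g.
    "start no more than 10 requests per 60s and run no more than 2
    requests concurrently"."""

    def __init__(self, max_concurrent, max_starts, window, clock):
        self.max_concurrent = max_concurrent
        self.max_starts = max_starts  # allowed starts per window
        self.window = window          # window length in seconds
        self.clock = clock
        self.running = 0
        self.starts = []              # timestamps of recent starts

    def try_start(self):
        """Return True and record a start if both limits permit it."""
        now = self.clock()
        # forget starts that have aged out of the window
        self.starts = [s for s in self.starts if now - s < self.window]
        if self.running >= self.max_concurrent or len(self.starts) >= self.max_starts:
            return False
        self.starts.append(now)
        self.running += 1
        return True

    def finish(self):
        """Release a concurrency slot when an operation completes."""
        self.running -= 1
```

Note the two limits fail independently: finishing a task frees a concurrency slot immediately, while the rate budget only recovers as old starts fall out of the window.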

clue avatar Jun 03 '18 10:06 clue