Pre-allocation of producers
As far as I understand, the constructor

```cpp
ConcurrentQueue(size_t minCapacity, size_t maxExplicitProducers, size_t maxImplicitProducers)
```

uses maxExplicitProducers and maxImplicitProducers to size the initial pool of blocks. However, it does not pre-allocate the producers themselves (via add_producer), which means try_enqueue can still malloc a producer when it is called.
Could the constructor mentioned above also include the following snippet in its body to minimise the chance of malloc when using try_enqueue?
```cpp
for (size_t i = 0; i != maxImplicitProducers; ++i) {
    auto* p = add_producer(create<ImplicitProducer>(this));
    p->inactive.store(true, std::memory_order_relaxed);
}
```
I just realized that creating a lot of idle producers has a performance impact, as the list of producers to recycle is the same as the list of producers traversed during dequeuing.
That is correct. I believe this is the only place try_enqueue may allocate.
You can use producer tokens to force producers to be created.
Unfortunately, this is not always possible. In my case, I can push from different threads, but I don't know beforehand which ones will be used. I would have to use something like thread_local storage to cache the producer per thread, which comes with its own issues.
Alternatively, to avoid paying the O(N) cost of traversing an oversized producerListTail on pop, I was thinking of keeping a secondary linked list (call it reservedListTail) of implicit producers. It would be filled at construction time, and recycle_or_create_producer would only dequeue from it, transferring the node to producerListTail, when producerListTail has no inactive producers left. That way reservedListTail stays invisible to dequeue(...), and we avoid paying the traversal cost for producers that were pre-allocated but never actually needed.
Would this make sense? I understand this might be too niche of a use case for this library, so I would probably just implement this separate list for my own project.
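In toy form, the two-list idea could look like the following (a single-threaded model with invented names, nothing like the real lock-free code; it only illustrates the promotion order):

```cpp
#include <cassert>
#include <cstddef>
#include <memory>
#include <vector>

// Producers pre-allocated at construction sit on `reserved`, which dequeue
// never traverses; they move to `active` one at a time, and only when
// `active` has no inactive producer left to recycle.
struct ToyProducer {
    bool inactive = true;
    ToyProducer* next = nullptr;
};

struct ToyProducerLists {
    ToyProducer* active = nullptr;    // the list dequeue would walk
    ToyProducer* reserved = nullptr;  // invisible to dequeue
    std::vector<std::unique_ptr<ToyProducer>> storage;

    void preallocate(std::size_t n) {
        for (std::size_t i = 0; i != n; ++i) {
            storage.push_back(std::make_unique<ToyProducer>());
            storage.back()->next = reserved;
            reserved = storage.back().get();
        }
    }

    // Mirrors the spirit of recycle_or_create_producer: recycle from
    // `active` first, then promote from `reserved`, and allocate only
    // as a last resort.
    ToyProducer* acquire() {
        for (ToyProducer* p = active; p != nullptr; p = p->next) {
            if (p->inactive) {
                p->inactive = false;
                return p;
            }
        }
        ToyProducer* p;
        if (reserved != nullptr) {
            p = reserved;
            reserved = p->next;
        } else {
            storage.push_back(std::make_unique<ToyProducer>()); // malloc path
            p = storage.back().get();
        }
        p->inactive = false;
        p->next = active;
        active = p;
        return p;
    }

    std::size_t active_length() const {
        std::size_t n = 0;
        for (ToyProducer* p = active; p != nullptr; p = p->next) ++n;
        return n;
    }
};
```

The point of the model: dequeue-side cost is proportional to `active_length()`, which only grows with producers actually used, while the pre-allocated reserve never inflates it.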