Hypercorn with several workers
I would like to start my Quart application with several workers, but this is not straightforward in combination with aioprometheus, because each worker has its own metrics state.
In the official prometheus client library this can be solved like this: https://github.com/prometheus/client_python#multiprocess-mode-eg-gunicorn
How can I solve this?
I think I understand the problem you are trying to solve. The aioprometheus package does not support a multiprocess mode like the one in the official Prometheus client library you linked (which I think looks pretty complicated). When I've run a FastAPI server with uvicorn (similar to what you are doing with Quart and Hypercorn), I've simply exposed the metrics from each worker separately. As you noticed, each worker has its own state.
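For reference, the workaround in the official `prometheus_client` library (the one linked above) writes each worker's samples to memory-mapped files in a shared directory, and the scrape handler aggregates them. A minimal sketch, assuming `prometheus_client` is installed and using a temp directory in place of a real shared path (the metric name is illustrative):

```python
import os, tempfile

# Multiprocess mode requires this env var before metrics are created;
# in a real deployment it points at a directory shared by all workers.
os.environ["PROMETHEUS_MULTIPROC_DIR"] = tempfile.mkdtemp()

from prometheus_client import CollectorRegistry, Counter, generate_latest, multiprocess

# Each worker updates metrics as usual; samples land in per-process files.
REQUESTS = Counter("app_requests", "Total requests handled")
REQUESTS.inc()

# The /metrics handler (in whichever worker answers) aggregates every
# worker's files into one consistent exposition.
registry = CollectorRegistry()
multiprocess.MultiProcessCollector(registry)
print(generate_latest(registry).decode())
```

This is the mechanism aioprometheus lacks: the scrape result no longer depends on which worker handles the request.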
But doesn't that make the metrics unstable, since each scrape can be answered by a different worker?
Any news on this? I agree with @l1f: the state should be shared across all workers, since a single scrape is expected to collect all metrics in a consolidated way.
Also waiting for an update on this. Not sure if you found a way, @l1f.