Gracefully closing the crawler with the keepAlive flag set to true
Which package is the feature request for? If unsure which one to select, leave blank
@crawlee/core
Feature
I'm using the PuppeteerCrawler with keepAlive set to true and calling crawler.run() without awaiting it.
This keeps the crawler running indefinitely, and when I insert new requests into the request queue, they get processed.
(I'm using a non-persisted request queue.)
What I want is to gracefully close the crawler: when I receive a shutdown signal, I want to process all the pending requests in the request queue first and only then kill the crawler.
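A minimal sketch of the setup described above, assuming standard PuppeteerCrawler options (the URL and handler body are illustrative):

```ts
import { PuppeteerCrawler } from 'crawlee';

const crawler = new PuppeteerCrawler({
    keepAlive: true, // keep the crawler alive even when the queue drains
    requestHandler: async ({ request, page }) => {
        console.log(`Processing ${request.url}: ${await page.title()}`);
    },
});

// Run without awaiting, so the process keeps accepting new requests indefinitely.
void crawler.run();

// Requests added later are picked up and processed.
await crawler.addRequests(['https://example.com']);
```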
Motivation
Right now, if I call crawler.teardown(), it abruptly closes the crawler instance without processing the pending requests.
Ideal solution or implementation, and any additional constraints
The crawler should expose a method that resolves once all pending requests have been processed and then shuts down the instance. A hedged sketch of the kind of workaround I have in mind is shown below.
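Continuing the sketch above, one possible interim approach (not an existing Crawlee API for graceful shutdown, just an assumption on my side) would be to poll the request queue's isFinished() before calling teardown():

```ts
// Hypothetical graceful-shutdown helper: wait until the request queue
// reports it is finished, then tear the crawler down.
async function gracefulShutdown(crawler: PuppeteerCrawler) {
    const queue = await crawler.getRequestQueue();

    // isFinished() resolves to true once the queue is empty and all
    // in-progress requests have been handled.
    while (!(await queue.isFinished())) {
        await new Promise((resolve) => setTimeout(resolve, 1000));
    }

    await crawler.teardown();
}

process.once('SIGTERM', () => {
    void gracefulShutdown(crawler);
});
```

A built-in method on the crawler would avoid this polling loop and cover in-progress requests reliably.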
Alternative solutions or implementations
No response
Other context
No response
@B4nan any update?