Reduce the size of `cron_job.py` instance.
It currently consumes ~40MB, which is too much memory; it should be much less.
Creating a new Python process will still have a memory footprint measured in tens of megabytes. We could try to reduce that footprint by switching to forking (as in `os.fork()`) and relying on copy-on-write (COW) memory, but it would still be hard to control memory usage.
As an alternative, I suggest switching to the `concurrent.futures` module with a `ProcessPoolExecutor` or a `ThreadPoolExecutor`, which would let us control how many tasks run simultaneously.
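A minimal sketch of what I have in mind, assuming the posting logic can be wrapped in a plain function (`post_update` and the worker count are hypothetical placeholders):

```python
from concurrent.futures import ThreadPoolExecutor

def post_update(item):
    # Placeholder for the real posting logic.
    return f"posted {item}"

# Cap concurrency so memory stays bounded: at most 4 tasks run at once,
# all within a single process, instead of spawning one process per task.
with ThreadPoolExecutor(max_workers=4) as executor:
    results = list(executor.map(post_update, range(10)))
```

Swapping `ThreadPoolExecutor` for `ProcessPoolExecutor` is a one-line change if the tasks turn out to be CPU-bound, at the cost of one extra process per worker.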
Later we can also switch to async/await to reduce the footprint even further.
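For the async/await route, the same bounded-concurrency idea can be expressed with a semaphore; this is just a sketch with a hypothetical `post_update` coroutine standing in for the real I/O:

```python
import asyncio

async def post_update(item):
    # Placeholder for the real posting logic; real code would await network I/O here.
    await asyncio.sleep(0)
    return f"posted {item}"

async def main():
    # The semaphore caps how many coroutines run at once, so concurrency
    # is bounded inside a single process with no extra threads or processes.
    sem = asyncio.Semaphore(4)

    async def limited(item):
        async with sem:
            return await post_update(item)

    return await asyncio.gather(*(limited(i) for i in range(10)))

results = asyncio.run(main())
```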
@Fillll would love to hear your feedback on this :)
Sounds like `concurrent.futures` could be a good solution, at least in the short term, since we are losing some postings now.
Just to be on the same page: we want all Python processes and DB instances to fit on the same 999MB instance.
@lgyanf, please go ahead and create a PR. Thank you!
Since this has been merged, I consider it done.