Fix data processing robustness issues
Currently, we have a data processing action that runs weekly / on request and uploads an artifact containing the dynamic data (Google Scholar, curated resources, tenzing). That artifact is then used by the deploy actions.
However, if that action fails to upload an artifact, the deploy action's fallback is incomplete, which leads to missing data. That redundancy is not ideal. We need to change the deploy action so that, if the artifact is missing or cannot be loaded, it triggers the data processing and then tries again to fetch the artifact. If that still fails, the action should error. The same applies to the action that consolidates PRs for staging (staging-aggregate).
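A minimal sketch of what that retry logic could look like in the deploy (and staging-aggregate) workflow, assuming the data processing workflow file is called `process-data.yml`, the artifact is named `dynamic-data`, and the workflow has a `workflow_dispatch` trigger plus `actions: write` permission for the token (all of these names are placeholders, not the actual ones in the repo):

```yaml
# Hypothetical deploy step: try to download the artifact; if that fails,
# dispatch the data processing workflow once, wait for it, and retry.
- name: Fetch dynamic data artifact (with one retry via workflow dispatch)
  env:
    GH_TOKEN: ${{ github.token }}
  run: |
    fetch_artifact() {
      run_id=$(gh run list --workflow=process-data.yml --status=success \
               --limit 1 --json databaseId --jq '.[0].databaseId')
      [ -n "$run_id" ] && gh run download "$run_id" --name dynamic-data --dir data/
    }
    if ! fetch_artifact; then
      echo "Artifact missing - triggering data processing and retrying once"
      gh workflow run process-data.yml
      sleep 30   # give the dispatched run time to appear (polling kept simple here)
      new_run=$(gh run list --workflow=process-data.yml --limit 1 --json databaseId --jq '.[0].databaseId')
      gh run watch "$new_run" --exit-status
      fetch_artifact || { echo "Data processing artifact still unavailable"; exit 1; }
    fi
```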
Also, the data processing action currently tries to commit the created files to a new branch, but that fails due to an issue with untracked files. This needs to be reviewed so that all files included in the artifact are force-pushed to that branch.
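For the branch update, a sketch of a step that stages the generated files (including untracked ones) and force-pushes them; the branch name `processed-data` and the `data/` output path are placeholders, and the workflow is assumed to have already run `actions/checkout` with write permission on contents:

```yaml
# Hypothetical data-processing step: put everything that goes into the
# artifact onto a dedicated branch, overwriting its previous state.
- name: Force-push generated data files to branch
  run: |
    git config user.name "github-actions[bot]"
    git config user.email "github-actions[bot]@users.noreply.github.com"
    git checkout -B processed-data        # (re)create the branch locally
    git add --force data/                 # stage generated files, even if untracked or gitignored
    git commit -m "Update processed data" || echo "Nothing to commit"
    git push --force origin processed-data
```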
Should be fixed in #537