Doug
We can replicate this issue; it fails approximately 25% of the time for us. Something similar seems to have been flagged here https://github.com/GoogleChrome/lighthouse/issues/11537, with a response of "retry...
If you've got a dump of that kind of size, you probably want to rerun with `SPX_SAMPLING_PERIOD` set to something reasonably high. That should dramatically reduce the size...
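Roughly what I mean, as a sketch: the exact variable names and values here (`SPX_ENABLED`, the `5000` µs period, `my-script.php`) are illustrative, so check the php-spx README for what your setup actually accepts.

```shell
# Re-run under SPX with a coarser sampling period (value is in microseconds,
# so 5000 = sample every 5 ms) instead of recording every single call.
# A higher period means fewer samples and a much smaller dump.
SPX_ENABLED=1 SPX_SAMPLING_PERIOD=5000 php my-script.php
```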
An interesting set of constraints! I see what you mean about there being no clear way of dealing with this, but I have a couple of ideas! As I see...
From my playing about, I *think* GitHub is only reading the last 1k commits from a given repository. My outlandishly disgusting hack to resolve this is to...