
GoogleAdsService doesn't properly apply retries when Retry is passed in

Open Kache opened this issue 3 years ago • 2 comments

Original report: https://github.com/googleads/google-ads-python/issues/597 Directed to this project by @BenRKarl

Excerpt:

GoogleAdsService provides search(), which accepts a retry: google.api_core.retry.Retry. The Retry decorator ends up being applied only to the first request, making it effectively useless for large results spanning many pages.

Subsequent requests are issued by SearchPager, which is initialized a few lines down without retry being passed in. As a result, retry is unavailable at invocation time and none of the SearchPager._method() calls are retry-decorated (a minimal sketch of the resulting call pattern follows).
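
For illustration, here is a minimal sketch of that call pattern (the customer ID and query are placeholders, and the exact search() signature varies a bit across google-ads-python versions): the Retry passed to search() guards only the first page request, while iterating the returned SearchPager fetches later pages through SearchPager._method() with no retry attached.

```python
from google.ads.googleads.client import GoogleAdsClient
from google.api_core.exceptions import DeadlineExceeded, ServiceUnavailable
from google.api_core.retry import Retry, if_exception_type

client = GoogleAdsClient.load_from_storage()  # assumes a google-ads.yaml config
ga_service = client.get_service("GoogleAdsService")

retry = Retry(
    predicate=if_exception_type(ServiceUnavailable, DeadlineExceeded),
    initial=1.0,
    maximum=30.0,
    multiplier=2.0,
)

# retry is applied here -- but only to the request for the first page.
response = ga_service.search(
    customer_id="1234567890",  # placeholder
    query="SELECT campaign.id FROM campaign",
    retry=retry,
)

# Iterating the SearchPager issues every subsequent page request via
# SearchPager._method(), which is called without retry (or timeout),
# so a transient failure on page 2+ is not retried.
for row in response:
    print(row.campaign.id)
```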

See original report (above) for environment details, steps to reproduce, some context background, followup questions, etc.

~~I'd also opened fix PR https://github.com/googleads/google-ads-python/pull/598, but I'm told it's code generated from this library.~~ I've opened https://github.com/googleapis/gapic-generator-python/pull/1245 that fixes Retry() for search().

Kache avatar Mar 18 '22 16:03 Kache

We're catching up on old bug reports. Is this still an issue?

vchudnov-g avatar Mar 14 '25 22:03 vchudnov-g

Yeah, probably. I've made custom fixes in my own usage that have been in place for a long time now:

I'm just going to give up on this, due to lack of support from Google. Luckily, this is Python, so it's possible to peer under the interface and develop a custom workaround.

If you're here reading this, takeaways so you can implement/workaround/fix yourself:

  • The built-in retry & timeout mechanism for both search_stream() and search() is broken & unreliable
    • search_stream() - completely broken, likely cannot be mitigated at the Python level
      • retry applies only to returning the not-yet-materialized generator, i.e. it is effectively useless
      • timeout wraps the entire stream, so you would need to know ahead of time how long the full result will take
    • search() - only applies retry & timeouts to the first paginated request
  • search() can be made resilient by applying retry and timeout to every request, not just the first (see the sketch after this list)

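This is not the author's exact custom fix, only a sketch of one way to patch it from user code: re-wrap the callable the pager uses for subsequent pages so that every page request gets retry and a per-request timeout. It relies on the private SearchPager._method attribute, so it may break across library versions.

```python
import functools

from google.api_core.exceptions import (
    DeadlineExceeded, InternalServerError, ServiceUnavailable)
from google.api_core.retry import Retry, if_exception_type


def resilient_search(ga_service, customer_id, query, retry=None, timeout=30.0):
    """Run GoogleAdsService.search() with retry/timeout applied to every page.

    Reaches into SearchPager._method (a private attribute of the generated
    pager), so treat this as a best-effort workaround, not a supported API.
    """
    if retry is None:
        retry = Retry(
            predicate=if_exception_type(
                InternalServerError, ServiceUnavailable, DeadlineExceeded),
            initial=1.0,
            maximum=60.0,
            multiplier=2.0,
        )

    # These retry/timeout arguments only protect the first page request.
    pager = ga_service.search(
        customer_id=customer_id, query=query, retry=retry, timeout=timeout)

    # Re-wrap the callable the pager uses for subsequent pages so each
    # page fetch is also retried and given a per-request timeout.
    pager._method = functools.partial(retry(pager._method), timeout=timeout)
    return pager
```

Usage then has the same shape as a plain search() call; the upstream PR linked earlier addresses the same gap in the generated code itself, which is the cleaner long-term fix.
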
With that fix applied, search() becomes far more reliable than search_stream(), whose streaming nature and naive retry/timeout semantics mean it never handled retries and timeouts properly in the first place.

Kache avatar Mar 17 '25 21:03 Kache