
BigQuery: Add "Forbidden: 403 Exceeded rate limits" to list of retryable exceptions #6434

Closed
bencaine1 opened this issue Nov 7, 2018 · 6 comments
Labels: api: bigquery, backend, type: feature request

bencaine1 commented Nov 7, 2018

OS: Linux dc32b7e8763a 4.9.0-6-amd64 #1 SMP Debian 4.9.82-1+deb9u3 (2018-03-02) x86_64 x86_64 x86_64 GNU/Linux
Python version: Python 2.7.6
google-cloud-bigquery: 1.6.0

We're getting the following error:

Forbidden: 403 Exceeded rate limits: too many concurrent queries for this project. For more information, see https://cloud.google.com/bigquery/troubleshooting-errors

The client's `_should_retry` helper should also retry when `isinstance(exc, Forbidden) and 'Exceeded rate limits' in str(exc)`.

@tseaver tseaver added type: question Request for information or clarification. Not an issue. api: bigquery Issues related to the BigQuery API. backend labels Nov 7, 2018

tseaver commented Nov 7, 2018

@bencaine1 Thanks for the report!

@tswast Can you please figure out with the back-end team why they return a 403 here, rather than the expected 429? RFC 2616 says of a 403:

> The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

Versus what RFC 6585 says for 429:

> The 429 status code indicates that the user has sent too many requests in a given amount of time ("rate limiting"). The response representations SHOULD include details explaining the condition, and MAY include a Retry-After header indicating how long to wait before making a new request.

@shollyman

The short answer for why rate-limit pushback is communicated using 403 rather than 429 is that BigQuery's behavior predates RFC 6585. I've filed an internal issue (119203044) with the team to consider modifying this behavior.


tswast commented Jun 28, 2019

FYI: internal issue 119203044 was closed as wontfix, as this would be a breaking change. The client-side workaround would be to look for "Exceeded rate limits" and retry in that case. (Also, it might actually make sense to always retry on 403 to give the auth library a chance to refresh credentials.)
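
That client-side workaround can be sketched as a small backoff wrapper (dependency-free and illustrative; `Forbidden` stands in for `google.api_core.exceptions.Forbidden`, and the helper name is an assumption, not part of the library):

```python
import random
import time


class Forbidden(Exception):
    """Stand-in for google.api_core.exceptions.Forbidden (HTTP 403)."""


def call_with_rate_limit_retry(fn, max_attempts=5, base_delay=1.0):
    """Call fn(); when BigQuery signals rate limiting via a 403, retry.

    Any other Forbidden (e.g. a real permission error) is re-raised
    immediately, matching the "look for 'Exceeded rate limits'" idea.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except Forbidden as exc:
            if "Exceeded rate limits" not in str(exc):
                raise  # genuine 403: retrying will not help
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the rate-limit error
            # Exponential backoff with jitter before trying again.
            time.sleep(base_delay * (2 ** attempt) * random.uniform(0.5, 1.5))
```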

@tswast tswast added type: feature request ‘Nice-to-have’ improvement, new feature or different behavior or design. and removed type: question Request for information or clarification. Not an issue. labels Jun 28, 2019

yan-hic commented Sep 9, 2019

@tswast any traction on this? We have implemented our own resubmit logic for loads, but we are now hitting a lot of quota limits on queries too.
It looks like `job.result()` could be the place to resubmit (and change the job id).
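
A resubmit-with-fresh-job-id loop along those lines might look like the sketch below. Everything here is illustrative: `client` is assumed to expose a `query(sql, job_id=...)` method returning a job with `.result()`, loosely mirroring `google.cloud.bigquery.Client`, and `Forbidden` stands in for `google.api_core.exceptions.Forbidden`:

```python
import uuid


class Forbidden(Exception):
    """Stand-in for google.api_core.exceptions.Forbidden (HTTP 403)."""


def run_query_with_resubmit(client, sql, max_attempts=3):
    """Run a query; on a rate-limit 403, resubmit under a fresh job id.

    BigQuery job ids must be unique, so a failed job cannot simply be
    re-run: each attempt submits a brand-new job.
    """
    last_exc = None
    for _ in range(max_attempts):
        job = client.query(sql, job_id="retry_{}".format(uuid.uuid4().hex))
        try:
            return job.result()
        except Forbidden as exc:
            if "Exceeded rate limits" not in str(exc):
                raise  # genuine 403: do not resubmit
            last_exc = exc
    raise last_exc
```

Note this only papers over the problem from the caller's side; as discussed below, a proper fix inside the client needs to coordinate retries across several API requests.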


tswast commented Sep 9, 2019

Since this requires retrying multiple API requests (not just the "get query results" request that is actually raising the error), this feature request will require careful design work before we can proceed with implementation.

CC @crwilcox


tswast commented Nov 12, 2019
