
Fix unclosed aiohttp.ClientResponse objects #2528

Merged · 2 commits into main · Sep 10, 2024

Conversation

@Wauplin (Contributor) commented Sep 9, 2024

Fix #2521.

Follow-up PR after #2496. Closing sessions when exiting the AsyncInferenceClient is not enough to get rid of all warnings. The ClientResponse objects also have to be closed explicitly in case they are not consumed entirely. This PR fixes this by registering all pending responses that have to be closed when a ClientSession is closed.
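
For illustration, here is a minimal, hypothetical sketch of such a registry (the _track_response and _close_session helpers are made-up names for this sketch, not the actual huggingface_hub internals):

import aiohttp

# Registry of responses that may still hold an open connection.
_pending_responses: set[aiohttp.ClientResponse] = set()

def _track_response(response: aiohttp.ClientResponse) -> None:
    # Register any response whose body might not be fully consumed,
    # e.g. a streaming response the caller abandons midway.
    _pending_responses.add(response)

async def _close_session(session: aiohttp.ClientSession) -> None:
    # Explicitly close every pending response before closing the session;
    # otherwise aiohttp may emit "Unclosed connection" warnings at GC time.
    for response in _pending_responses:
        response.close()
    _pending_responses.clear()
    await session.close()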

It was a transient error that depended on when the garbage collector was triggered, which made it difficult to investigate. Many thanks to @tomjorquera, who provided a reproducible example. I used it to run the example 100 times (with a progress bar) and it seems resolved (a sketch of such a loop follows the log output):

# on 'main' branch
  3%|██▏                                                                    | 3/100 [00:02<01:08,  1.42it/s]Unclosed connection
client_connection: Connection<ConnectionKey(host='api-inference.huggingface.co', port=443, is_ssl=True, ssl=None, proxy=None, proxy_auth=None, proxy_headers_hash=4764392330092113202)>
 19%|█████████████▎                                                        | 19/100 [00:12<00:49,  1.63it/s]Unclosed connection
client_connection: Connection<ConnectionKey(host='api-inference.huggingface.co', port=443, is_ssl=True, ssl=None, proxy=None, proxy_auth=None, proxy_headers_hash=4764392330092113202)>
Unclosed connection
client_connection: Connection<ConnectionKey(host='api-inference.huggingface.co', port=443, is_ssl=True, ssl=None, proxy=None, proxy_auth=None, proxy_headers_hash=4764392330092113202)>
Unclosed connection
client_connection: Connection<ConnectionKey(host='api-inference.huggingface.co', port=443, is_ssl=True, ssl=None, proxy=None, proxy_auth=None, proxy_headers_hash=4764392330092113202)>
 29%|████████████████████▎                                                 | 29/100 [00:18<00:46,  1.54it/s]
(...)
# with this PR
 100%|█████████████████████████████████████████████████████████████████████| 100/100
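
For context, a hypothetical repro loop in that spirit (this is not @tomjorquera's actual script, and the AsyncInferenceClient usage shown is an assumption):

import asyncio
from tqdm import tqdm
from huggingface_hub import AsyncInferenceClient

async def main() -> None:
    client = AsyncInferenceClient()
    for _ in tqdm(range(100)):
        stream = await client.text_generation("Hello", stream=True)
        async for _token in stream:
            break  # abandon the stream: the response body is never fully consumed

asyncio.run(main())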

I've also added a test to ensure responses are closed when a session is closed (and thanks to #2521, the sessions are closed when the AsyncInferenceClient is closed).
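
Continuing the hypothetical sketch above (reusing its made-up _track_response/_close_session helpers; this is not the PR's actual test), such a test could look roughly like:

import asyncio
import aiohttp

async def test_responses_closed_with_session() -> None:
    session = aiohttp.ClientSession()
    response = await session.get("https://huggingface.co")
    _track_response(response)  # body intentionally left unconsumed
    await _close_session(session)
    assert response.closed     # the registry closed it before the session

asyncio.run(test_responses_closed_with_session())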


@hanouticelina (Contributor) left a comment

Thanks @Wauplin! Before this PR, I thought the ClientResponse objects were closed implicitly when closing the ClientSession, but it seems that aiohttp allows streaming responses to outlive their session. Nice 👍
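
A hypothetical snippet illustrating that behavior (URL is illustrative):

import asyncio
import aiohttp

async def main() -> None:
    session = aiohttp.ClientSession()
    response = await session.get("https://huggingface.co")
    await session.close()
    # The response was never read nor closed: it can stay open and trigger
    # an "Unclosed connection" warning when the garbage collector runs.
    print(response.closed)  # may still be False

asyncio.run(main())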

@Wauplin (Contributor, Author) commented Sep 10, 2024

Same here, I wouldn't have expected them to outlive sessions. Most of the time, those objects are meant to be used as context managers, which we don't do here for practical reasons. This is why we are hitting these corner cases.
(Btw, using httpx for both InferenceClient and AsyncInferenceClient would have been much easier to deal with than requests + aiohttp. We haven't done it, for consistency with other parts of the library, but we can think about a switch in the future if we really see a need.)
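
For reference, the context-manager idiom mentioned above closes the response automatically (URL is illustrative):

import asyncio
import aiohttp

async def main() -> None:
    async with aiohttp.ClientSession() as session:
        async with session.get("https://huggingface.co") as response:
            await response.read()
    # Both the response and the session are closed here. This pattern doesn't
    # fit AsyncInferenceClient, which must hand the (possibly streaming)
    # response back to the caller instead of consuming it in-place.

asyncio.run(main())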

In the meantime, let's get this merged :)

@Wauplin merged commit 90c08d6 into main on Sep 10, 2024
18 of 19 checks passed
@Wauplin deleted the 2521-fix-unclosed-async-responses branch on September 10, 2024 at 06:49
Successfully merging this pull request may close these issues:

AsyncInferenceClient not closing the answer properly when response stream is not entirely consumed (#2521)