
AsyncInferenceClient.chat_completion streaming is broken with TGI 2.2.0 #2455

Closed
cordawyn opened this issue Aug 18, 2024 · 4 comments · Fixed by #2458
Labels
bug Something isn't working

Comments


cordawyn commented Aug 18, 2024

Describe the bug

When using AsyncInferenceClient.chat_completion(stream=True), the client expects the endpoint to emit a [DONE] token at the end of the stream (which TGI does). However, due to differences in token parsing between InferenceClient and AsyncInferenceClient, the line

if byte_payload == b"data: [DONE]":

receives either b"data: [DONE]" (sync client) or b"data: [DONE]\n" (async client). Naturally, the async client fails the check (because of the extra "\n") and passes "[DONE]\n" on to JSON parsing, which then fails.

The suggested fix is trivial: call .rstrip() on byte_payload before the comparison.
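The fix can be sketched as follows (parse_stream_line is a hypothetical stand-in for the client's internal line parser, not the actual huggingface_hub function):

```python
import json

def parse_stream_line(byte_payload: bytes):
    """Return None for the [DONE] sentinel, else the decoded JSON chunk."""
    # rstrip() tolerates the trailing "\n" that the async client leaves on the line
    payload = byte_payload.rstrip()
    if payload == b"data: [DONE]":
        return None
    if payload.startswith(b"data:"):
        return json.loads(payload[len(b"data:"):])
    return None
```

Without the rstrip(), b"data: [DONE]\n" fails the equality check and falls through to json.loads, which raises a JSONDecodeError.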

Unfortunately, I cannot provide a merge request for this, because the unit tests rely on auto-generated (?) cassettes that do not contain [DONE] tokens. Editing them by hand doesn't seem like a good idea (although I tried it, and doing so reproduced the bug).

Reproduction

import asyncio
from huggingface_hub import AsyncInferenceClient

async def test_async_client():
    # assuming TGI is running locally
    client = AsyncInferenceClient(base_url="127.0.0.1")
    output = await client.chat_completion(
        [
            { "role": "user", "content": "Hello!" }
        ],
        stream=True)
    async for token in output:
        print(token)

asyncio.run(test_async_client())

or edit https://github.com/huggingface/huggingface_hub/blob/e9cd695d7bd9e81b4eceb8f4da557a0cfa387b99/tests/cassettes/test_async_chat_completion_with_stream.yaml by adding "[DONE]" to the end of the body, then run the tests in https://github.com/huggingface/huggingface_hub/blob/e9cd695d7bd9e81b4eceb8f4da557a0cfa387b99/tests/test_inference_async_client.py (test_async_chat_completion_with_stream will fail).

Logs

No response

System info

- huggingface_hub version: 0.25.0.dev0
- Platform: Linux-5.15.133.1-microsoft-standard-WSL2-x86_64-with-glibc2.35
- Python version: 3.10.0
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Token path ?: /home/cordawyn/.cache/huggingface/token
- Has saved token ?: False
- Configured git credential helpers:
- FastAI: N/A
- Tensorflow: N/A
- Torch: N/A
- Jinja2: 3.1.4
- Graphviz: N/A
- keras: N/A
- Pydot: N/A
- Pillow: 10.4.0
- hf_transfer: N/A
- gradio: 4.26.0
- tensorboard: N/A
- numpy: 1.26.4
- pydantic: 2.8.2
- aiohttp: 3.9.1
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: /home/cordawyn/.cache/huggingface/hub
- HF_ASSETS_CACHE: /home/cordawyn/.cache/huggingface/assets
- HF_TOKEN_PATH: /home/cordawyn/.cache/huggingface/token
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10
@cordawyn cordawyn added the bug Something isn't working label Aug 18, 2024

Wauplin commented Aug 19, 2024

Thanks for reporting @cordawyn! I've applied the suggested change in #2458 + fixed the tests accordingly. FYI, to (re-)record a cassette you can delete the file and relaunch the test locally. These cassettes are not ideal (we missed a bug because of them), but that will be for another PR.


Wauplin commented Aug 19, 2024

The hot-fix release has been shipped: https://github.com/huggingface/huggingface_hub/releases/tag/v0.24.6. Upgrading huggingface_hub to the latest version should fix your problem :)

cordawyn (Author) commented

@Wauplin Thank you for the quick fix! I hope I'll be able to contribute in a more productive manner next time! 😉


Wauplin commented Aug 19, 2024

Reporting it and immediately suggesting a fix is already impactful! That was quick thanks to you 🤗
