ModelInfo bug #2186
Comments
It happened to me as well while working (no changes in dependencies/code, etc.).
Thanks for reporting @Narsil @kresimirfijacko! Will check if this is a breaking change server side and open a PR to fix it client side anyway.
Yeah, it's probably something server-side related; it all of a sudden stopped working.
Hey, I have the same issue; this started happening today around 3 PM. I cannot use the vllm server nor the
Workaround: add sharded: None in the file hf_api.py, like this:
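The actual patch did not survive extraction from the thread; below is a minimal sketch of the idea, assuming the failure is a `TypeError` from a dataclass whose fields no longer match the server payload (field names are simplified; the real class lives in huggingface_hub's `hf_api.py`):

```python
from dataclasses import dataclass
from typing import Optional

# Sketch of the client-side dataclass that huggingface_hub builds from the
# /api/models payload. Giving "sharded" a default value (the "sharded: None"
# workaround) lets both payload shapes -- with or without the key --
# construct without raising a TypeError.
@dataclass
class SafeTensorsInfo:
    parameters: dict
    total: int
    sharded: Optional[bool] = None  # the suggested workaround

# A payload that includes the key parses:
info = SafeTensorsInfo(parameters={"F16": 1000}, total=1000, sharded=True)
# A payload that omits it still parses thanks to the default:
legacy = SafeTensorsInfo(parameters={}, total=0)
```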
+1 |
Just had the same issue. It was working earlier today.

```shell
docker run --runtime nvidia --gpus all \
    -v ~/.cache/huggingface:/root/.cache/huggingface \
    --env "HUGGING_FACE_HUB_TOKEN=<secret>" \
    -p 8000:8000 \
    --ipc=host \
    vllm/vllm-openai:latest \
    --model TheBloke/Mixtral-8x7B-Instruct-v0.1-AWQ --quantization awq --tensor-parallel-size 4
```
Same issue here too. Started today.
I think the server side is fixed, so this workaround is deprecated.
Hey everyone, thanks for quickly reporting the issue and suggesting a workaround. The failure is indeed due to a server-side change, and we are discussing solutions to mitigate it. In the meantime, I opened #2190 to fix the issue client-side (which will make the class future-proof). To get an immediate fix, please install from this branch:

```shell
pip install git+https://github.com/huggingface/huggingface_hub@2186-fix-safetensors-info
```

EDIT: no need to install a new version of
How can I re-upload my model after a 24h training process died because of this bug?
+1 |
Thank you @Wauplin. As a reminder, you can use this syntax to include optional dependencies:
EDIT: no need to install a new version of
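The syntax the comment refers to did not survive extraction; a hedged sketch using pip's direct-reference syntax for installing a git branch together with an extras group (the extra name `cli` is illustrative, not from the thread):

```shell
# Install a specific branch while still pulling in an optional-dependency
# group; the extra name ("cli") is illustrative only.
pip install "huggingface_hub[cli] @ git+https://github.com/huggingface/huggingface_hub@2186-fix-safetensors-info"
```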
Is this fixed? I still get this error :(
A fix was deployed a few minutes ago. This should be fixed for everyone without updating any dependencies. Sorry again for the inconvenience, and thanks everyone for your reactivity on this 🤗
I got an error: git checkout -q 2186-fix-safetensors-info did not run successfully.
Yes, sorry, the PR has been merged and is now on
Describe the bug
Cannot load model information on some repos.
Reproduction
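The original reproduction snippet did not survive extraction; below is a minimal, self-contained sketch of the presumed failure mode (the class and key names follow the workaround discussed above; the payload values are illustrative):

```python
from dataclasses import dataclass

# Simplified stand-in for the client-side dataclass that huggingface_hub
# builds from the /api/models response (the real one lives in hf_api.py).
@dataclass
class SafeTensorsInfo:
    parameters: dict
    total: int

# When the server payload carries a key the dataclass does not declare,
# construction fails with a TypeError:
payload = {"parameters": {"F16": 1000}, "total": 1000, "sharded": True}
try:
    SafeTensorsInfo(**payload)
except TypeError as err:
    print(f"TypeError: {err}")
```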
Logs