
Can not load local model with Auto-GPTQ #6

Open
bonuschild opened this issue Nov 1, 2023 · 2 comments

@bonuschild

Here is my output after executing:

(autogptq) root@XXX:/mnt/e/Downloads/AutoGPTQ-API# python blocking_api.py
Traceback (most recent call last):
  File "/mnt/e/Downloads/AutoGPTQ-API/blocking_api.py", line 29, in <module>
    model = AutoGPTQForCausalLM.from_quantized(model_name_or_path,
  File "/root/miniconda3/envs/autogptq/lib/python3.10/site-packages/auto_gptq/modeling/auto.py", line 108, in from_quantized
    return quant_func(
  File "/root/miniconda3/envs/autogptq/lib/python3.10/site-packages/auto_gptq/modeling/_base.py", line 791, in from_quantized
    raise FileNotFoundError(f"Could not find model in {model_name_or_path}")
FileNotFoundError: Could not find model in ../models/WizardCoder-15B-1.0-GPTQ

And here is my self-check output under WSL2 + conda + Python 3.10, following the README.md in this repository:

(autogptq) root@XXX:/mnt/e/Downloads/AutoGPTQ-API# pwd
/mnt/e/Downloads/AutoGPTQ-API
(autogptq) root@XXX:/mnt/e/Downloads/AutoGPTQ-API# ls ../models/WizardCoder-15B-1.0-GPTQ/
README.md          generation_config.json  quantize_config.json     tokenizer_config.json
added_tokens.json  merges.txt              special_tokens_map.json  vocab.json
config.json        model.safetensors       tokenizer.json
(autogptq) root@XXX:/mnt/e/Downloads/AutoGPTQ-API# grep -rn "model_name_or_path" blocking_api.py
17:#model_name_or_path = "../models/WizardCoder-15B-1.0-GPTQ"
18:model_name_or_path = "../models/WizardCoder-15B-1.0-GPTQ"
27:tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
29:model = AutoGPTQForCausalLM.from_quantized(model_name_or_path,
(autogptq) root@XXX:/mnt/e/Downloads/AutoGPTQ-API# pip list | grep auto-gptq
auto-gptq          0.4.2
(autogptq) root@XXX:/mnt/e/Downloads/AutoGPTQ-API#

The model clearly exists, so why can't Auto-GPTQ find it?

@mzbac (Owner) commented Nov 1, 2023

Hi @bonuschild,
This is a fairly old repository, and the way Auto-GPTQ loads models may have changed since. Please refer to the similar issue in the Auto-GPTQ repository: AutoGPTQ/AutoGPTQ#133 (comment)
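For anyone hitting the same error: in auto-gptq 0.4.x, `from_quantized` looks for a weights file named after a model basename (derived from `quantize_config.json` or a default such as `gptq_model-4bit-128g`), so a checkpoint saved as `model.safetensors` may not be found even though the directory is correct. A minimal sketch of a workaround under that assumption — `model_basename` and `use_safetensors` are real keyword arguments in auto-gptq 0.4.x, but the device string here is a placeholder for your setup:

```python
# Sketch (assumes auto-gptq 0.4.x): point from_quantized at the actual
# weights file by its basename instead of relying on the default lookup.
from auto_gptq import AutoGPTQForCausalLM

model_name_or_path = "../models/WizardCoder-15B-1.0-GPTQ"

model = AutoGPTQForCausalLM.from_quantized(
    model_name_or_path,
    model_basename="model",   # matches model.safetensors (extension omitted)
    use_safetensors=True,
    device="cuda:0",          # placeholder; adjust for your hardware
)
```

Renaming `model.safetensors` to the basename recorded in `quantize_config.json`, or upgrading auto-gptq (newer versions auto-detect `model.safetensors`), should also resolve the lookup.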

@bonuschild (Author) commented Nov 1, 2023 via email
