Hi,
I am encountering an AssertionError while running run_llava.py for model evaluation. I used this command:
```shell
python run_llava.py \
    --model-path /LLaVA/llava/checkpoints/llava-2-7b-chat-task-qlora \
    --model-base /LLaVA/llava/llava-v1.5-7b \
    --image-file /LLaVA/fine-tuning/llava_dataset/images/097c0574-d499-4ae6-9342.jpg \
    --query "Describe the image."
```
It fails with the following stack trace:
```
UserWarning: Merge lora module to 4-bit linear may get different generations due to rounding errors.
Model is loaded...
Traceback (most recent call last):
  File "/content/LLaVA/llava/eval/run_llava.py", line 145, in <module>
    eval_model(args)
  File "/content/LLaVA/llava/eval/run_llava.py", line 115, in eval_model
    output_ids = model.generate(
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/content/LLaVA/llava/model/language_model/llava_llama.py", line 137, in generate
    return super().generate(
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/generation/utils.py", line 1525, in generate
    return self.sample(
  File "/usr/local/lib/python3.10/dist-packages/transformers/generation/utils.py", line 2622, in sample
    outputs = self(
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/content/LLaVA/llava/model/language_model/llava_llama.py", line 91, in forward
    return super().forward(
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/llama/modeling_llama.py", line 1201, in forward
    logits = self.lm_head(hidden_states)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/bitsandbytes/nn/modules.py", line 414, in forward
    assert self.weight.shape[1] == 1
AssertionError
```
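For context, the assertion that fires is a layout guard in bitsandbytes' `Linear4bit.forward`. My understanding (an assumption, not confirmed by the trace itself) is that the 4-bit quantized weight is stored as a packed column tensor of shape `(num_packed_bytes, 1)`, so the guard `self.weight.shape[1] == 1` trips when the weight has been replaced by a full 2-D matrix, e.g. after merging the LoRA adapter. A minimal sketch of that check, with illustrative shapes:

```python
# Sketch (assumption): bitsandbytes' Linear4bit stores its quantized weight
# as a flattened column tensor of shape (num_packed_bytes, 1), and its
# forward() guards that layout with `assert self.weight.shape[1] == 1`.
# A weight overwritten by a merged full-precision matrix of shape
# (out_features, in_features) no longer satisfies that guard.

def linear4bit_layout_check(weight_shape):
    """Mimic the layout guard from bitsandbytes.nn.Linear4bit.forward."""
    assert weight_shape[1] == 1, "weight is not in 4-bit packed form"

linear4bit_layout_check((4096 * 4096 // 2, 1))  # packed 4-bit weight: passes
try:
    linear4bit_layout_check((32000, 4096))      # full lm_head matrix: fails
except AssertionError as e:
    print(f"AssertionError: {e}")
```

If that reading is right, the UserWarning about merging LoRA modules into 4-bit linear layers points at the same root cause.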
Has anyone encountered a similar error? If so, how did you resolve it?
Thank you!