In the code, when LoRA is enabled, `freeze_llm` must be set to False. Shouldn't it be the opposite, as in your other repositories such as QwenVL-Finetune?
```python
assert not (training_args.lora_enable and training_args.freeze_llm), \
    'When using LoRA, the LLM should not be frozen. If you want to freeze the LLM, please disable LoRA.'
```
By contrast, the QwenVL-Finetune repo raises:
```python
raise ValueError("If `lora_enable` is True, `freeze_llm` must also be True.")
```
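
For context, the standard LoRA setup freezes the base model weights and trains only the injected adapter matrices, so whether `freeze_llm` should be True or False likely comes down to what each repo means by that flag (freezing the full base weights vs. excluding the LLM from training entirely). Below is a minimal sketch using Hugging Face PEFT, not code from either repo; `gpt2` is just a placeholder base model:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Load a small placeholder base model (stand-in for the actual LLM).
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Inject LoRA adapters; get_peft_model freezes every base parameter
# and leaves only the lora_A / lora_B adapter matrices trainable.
config = LoraConfig(r=8, lora_alpha=16, target_modules=["c_attn"], lora_dropout=0.05)
model = get_peft_model(model, config)

# Only the LoRA adapter weights should show up as trainable here,
# i.e. the base LLM is effectively "frozen" under LoRA.
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)
```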