The intermediate checkpoint can't be loaded by from_pretrained when fine-tuning Whisper #1753
Unanswered
shuaijiang asked this question in Q&A
I fine-tuned Whisper using the code at https://github.com/shuaijiang/Whisper-Finetune/blob/master/finetune_all.py
I keep the last 5 checkpoints by setting save_total_limit=5 in the Seq2SeqTrainingArguments.
For inference, I load an intermediate checkpoint with WhisperForConditionalGeneration.from_pretrained(ckpt_path). It says: Some weights of the model checkpoint at [checkpoint_path] were not used when initializing WhisperForConditionalGeneration: [list of weights]. And it doesn't work: the model only generates 0.
However, when I load the final model, which was saved by model.save_pretrained(), it works.
Thanks for any advice!
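For reference, here is a minimal sketch of the save/load workflow described in the question; the model name, output directory, and checkpoint step number are placeholders, not values from the original script.

```python
# Minimal sketch of the workflow described above, not the exact script:
# the output path and checkpoint step number are placeholders.
from transformers import Seq2SeqTrainingArguments, WhisperForConditionalGeneration

# Training side: keep only the last 5 checkpoints, as in the post.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-finetune-output",  # placeholder path
    save_total_limit=5,
)
# ... training with Seq2SeqTrainer(args=training_args, ...) then writes
# intermediate checkpoints such as ./whisper-finetune-output/checkpoint-1000 ...

# Inference side: loading an intermediate Trainer checkpoint directly.
# This is the step that warns "Some weights of the model checkpoint at ...
# were not used when initializing WhisperForConditionalGeneration" and then
# only ever generates 0.
ckpt_path = "./whisper-finetune-output/checkpoint-1000"  # placeholder step
model = WhisperForConditionalGeneration.from_pretrained(ckpt_path)

# By contrast, the final model saved explicitly with
# model.save_pretrained("./final-model") loads and generates correctly.
```

Note that an intermediate Trainer checkpoint directory also holds optimizer and scheduler state alongside the model weight file, and the warning lists exactly which tensors from_pretrained skipped; inspecting those names can show what the checkpoint's weight file actually contains.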
Replies: 1 comment
-
Hi @shuaijiang, I'm having the same issue. Have you found any solution?