Finetuning Whisper ASR Model for Translation Tasks #1343
Unanswered · kikozi2000 asked this question in Q&A
Replies: 3 comments
- I haven't done any fine-tuning, but I think you can just follow the official guide; when you load the tokenizer, you change the task to "translate".
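For reference, a minimal sketch of what that tokenizer/processor change might look like, assuming the Hugging Face transformers Whisper classes used in the official fine-tuning guide, the openai/whisper-small checkpoint, and Hindi source audio (checkpoint and language are assumptions, not from the thread). Note that Whisper's built-in translate task only goes from other languages into English.

```python
from transformers import (
    WhisperFeatureExtractor,
    WhisperForConditionalGeneration,
    WhisperProcessor,
    WhisperTokenizer,
)

checkpoint = "openai/whisper-small"  # assumed checkpoint; any Whisper size works the same way

feature_extractor = WhisperFeatureExtractor.from_pretrained(checkpoint)

# The one change vs. plain ASR fine-tuning: task="translate" instead of task="transcribe".
# `language` is the *source* (spoken) language; the translate task outputs English only.
tokenizer = WhisperTokenizer.from_pretrained(checkpoint, language="Hindi", task="translate")
processor = WhisperProcessor.from_pretrained(checkpoint, language="Hindi", task="translate")

model = WhisperForConditionalGeneration.from_pretrained(checkpoint)
# On recent transformers versions, the generation-time language/task are set on the config:
model.generation_config.language = "hindi"
model.generation_config.task = "translate"
```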
- @kikozi2000 Hey, did you try fine-tuning it for translation tasks?
- At this point, just use NLLB-200 for translation; don't use Whisper.
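If the end goal is text translation (for example, translating transcripts that Whisper already produced), a quick sketch of what the NLLB-200 suggestion could look like, assuming the Hugging Face translation pipeline and the facebook/nllb-200-distilled-600M checkpoint; neither is specified in the thread, and NLLB is text-to-text, so an ASR step is still needed for audio:

```python
from transformers import pipeline

# Assumed checkpoint; NLLB-200 uses FLORES-200 language codes ("hin_Deva", "eng_Latn", ...).
translator = pipeline(
    "translation",
    model="facebook/nllb-200-distilled-600M",
    src_lang="hin_Deva",
    tgt_lang="eng_Latn",
)

transcript = "नमस्ते, आप कैसे हैं?"  # e.g. a Whisper transcript of the source audio
print(translator(transcript, max_length=128)[0]["translation_text"])
```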
- Original question from kikozi2000:
Hi there,
I'm interested in fine-tuning the Whisper ASR model to perform translation tasks. Has anyone here tried this before? If so, could you provide some guidance on the required dataset format and the process of training the model for this purpose? Any insights or experiences shared would be greatly appreciated.
Thanks in advance!
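On the dataset-format part of the question: each training example needs the audio plus the translated target text as labels, so preprocessing looks the same as in the official ASR fine-tuning guide except that the tokenizer encodes the English translation rather than the source-language transcript. A rough sketch, assuming a hypothetical dataset with "audio" and "translation" columns (CoVoST 2-style) and the processor loaded with task="translate" as above:

```python
from datasets import Audio, load_dataset
from transformers import WhisperProcessor

processor = WhisperProcessor.from_pretrained(
    "openai/whisper-small", language="Hindi", task="translate"
)

# Hypothetical dataset name and columns: an "audio" column plus a "translation" column
# holding the English target text. Swap in your own dataset and column names.
dataset = load_dataset("your-org/your-speech-translation-dataset", split="train")
dataset = dataset.cast_column("audio", Audio(sampling_rate=16_000))  # Whisper expects 16 kHz

def prepare_example(batch):
    audio = batch["audio"]
    # Log-Mel input features from the raw waveform.
    batch["input_features"] = processor.feature_extractor(
        audio["array"], sampling_rate=audio["sampling_rate"]
    ).input_features[0]
    # Labels are the token ids of the English translation, not the source transcript.
    batch["labels"] = processor.tokenizer(batch["translation"]).input_ids
    return batch

dataset = dataset.map(prepare_example, remove_columns=dataset.column_names)
# From here, training follows the official guide (data collator, Seq2SeqTrainer, etc.).
```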