Just out of curiosity, does anyone know how much GPU resources and time OpenAI used to train Whisper? #1508
maowandong started this conversation in General
-
I'm not aware of publicly available information on the GPU resources used, but the Whisper paper describes a training corpus of 680,000 hours of audio, with training details for the different model sizes given in Section 2.4.
-
Just out of curiosity, does anyone know how much GPU resources and time OpenAI used to train Whisper?
I am fine-tuning whisper-large-v2 for Chinese with a 10,000+ hour corpus (15 million audio-label pairs) on 8x A100 GPUs. It takes 8-9 seconds per iteration, and processing a total of 3 epochs takes 18-20 days.
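As a rough sanity check on those numbers, the wall-clock time follows directly from samples, batch size, per-iteration speed, and epoch count. Here's a minimal sketch; note the global batch size of 256 is an assumed value, since the comment doesn't state it:

```python
import math

def training_days(num_samples, global_batch, secs_per_iter, epochs):
    """Estimate wall-clock training time in days from per-iteration speed."""
    iters_per_epoch = math.ceil(num_samples / global_batch)
    total_secs = iters_per_epoch * epochs * secs_per_iter
    return total_secs / 86400  # 86400 seconds in a day

# 15M audio-label pairs, 3 epochs, ~8.5 s/iteration (midpoint of 8-9 s),
# and an ASSUMED global batch of 256 spread across the 8 GPUs.
days = training_days(15_000_000, 256, 8.5, 3)
print(f"~{days:.1f} days")
```

With these assumptions the estimate lands in the mid-to-high teens of days, consistent with the reported 18-20 days at the slower end of the 8-9 s range or a slightly smaller batch.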