Hello everyone,

I've resolved the issue! The problem was that the length of my labels exceeded Whisper's max_target_positions configuration. For instance, the default max_target_positions for whisper-large-v3 is 448 tokens. You can either trim your labels (filter out or truncate over-long examples) or adjust the configuration.
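As a minimal sketch of the "trim your labels" option: the helpers below assume each example carries a `"labels"` list of token ids produced by the Whisper tokenizer, and use 448 as the limit to match whisper-large-v3's default. The names `filter_long_labels` and `truncate_labels` are illustrative, not from any library.

```python
# Guard label length against Whisper's max_target_positions.
# Assumption: each example is a dict with a "labels" list of token ids;
# 448 matches the default max_target_positions for whisper-large-v3.
MAX_TARGET_POSITIONS = 448

def filter_long_labels(examples, max_len=MAX_TARGET_POSITIONS):
    """Drop examples whose label sequence would overflow the decoder."""
    return [ex for ex in examples if len(ex["labels"]) <= max_len]

def truncate_labels(example, max_len=MAX_TARGET_POSITIONS):
    """Alternative: keep the example but cut its labels to max_len
    (note this silently discards the end of long transcripts)."""
    example["labels"] = example["labels"][:max_len]
    return example
```

If you preprocess with Hugging Face `datasets`, the same length check can be passed to `Dataset.filter` so over-long examples never reach the trainer.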

Additionally, I submitted a pull request that aims to prevent such issues in future versions of transformers. For more information on similar issues, feel free to check out this issue and this one.

Answer selected by AmirMohammadFakhimi