Whisper CLI compatible client using CTranslate2 and Faster-Whisper #1137
Replies: 6 comments 8 replies
-
Thanks, I'll definitely be checking this out!
-
Amazing!
-
Yes, starting from version 0.8, using --model you can select large-v1 and large-v2, just as in the OpenAI client.
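For reference, model selection from the command line would look something like this (a sketch; the audio filename is a placeholder, and `--model` is the flag mentioned above):

```shell
# Select the large-v2 model, as with the original OpenAI client
# (requires whisper-ctranslate2 >= 0.8)
whisper-ctranslate2 audio.mp3 --model large-v2

# Or the previous large model
whisper-ctranslate2 audio.mp3 --model large-v1
```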
-
Works great, I will be changing freesubtitles.ai to use this, thanks a lot!
-
Freesubtitles.ai is now running faster-whisper using this CLI, thanks a lot! Works like a charm!
-
Hello,
I have built a Whisper command-line client compatible with the original OpenAI client. It uses CTranslate2 and Faster-Whisper, which is up to 4 times faster than openai/whisper for the same accuracy while using less memory.
See: https://pypi.org/project/whisper-ctranslate2/
GitHub: https://github.com/jordimas/whisper-ctranslate2
I hope you find it useful!
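Since the package is on PyPI, getting started should be a pip install away. A minimal sketch (the audio filename here is a placeholder):

```shell
# Install the CLI from PyPI
pip install -U whisper-ctranslate2

# Invoke it like the original OpenAI whisper client
whisper-ctranslate2 inaguracio2011.mp3 --model medium
```

Because the interface mirrors the OpenAI client, existing `whisper` invocations should mostly work by swapping the command name.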