Replies: 7 comments 5 replies
-
Try changing this: Line 295 in 2037b65 to:
-
I'm considering TPU as well. It would be great if anyone has succeeded. 🙏
-
I have not used TPU on Colab yet, but since no one has mentioned it in this thread so far, note that there seems to be a special version of the "Getting Started with PyTorch on Cloud TPUs" notebook. This is the suggested command:
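Separately from whatever that notebook's install command is, here is a hypothetical sketch of what loading Whisper onto the XLA device could look like. This is my own sketch, not the notebook's code; it assumes the `openai-whisper` and `torch_xla` packages are already installed on the runtime, and the function name is made up for illustration.

```python
# Hypothetical sketch (not the notebook's suggested command): put the Whisper
# model on a Cloud TPU core exposed as a PyTorch device via torch_xla.
def load_whisper_on_tpu(model_name: str = "base"):
    import whisper                        # openai-whisper package
    import torch_xla.core.xla_model as xm

    device = xm.xla_device()              # the TPU core as a torch.device
    return whisper.load_model(model_name).to(device)
```

Whether `model.transcribe(...)` then actually works end-to-end on XLA is exactly what this thread is unsure about, so treat this as a starting point, not a confirmed recipe.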
-
From what I understand, the model would need to be quantized first.
-
Tried it on a Google Cloud TPU; it did not work 😵‍💫
-
As of #1277 it's possible to run Whisper on TPU, but only on Kaggle TPUs.
-
Try running whisper-jax for transcription; it works for me and is nearly 60× real-time on a TPU v5.
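If I remember the whisper-jax README correctly, basic usage looks roughly like the sketch below. It assumes the `whisper-jax` package (from sanchit-gandhi/whisper-jax) is installed with a TPU runtime attached, and `"audio.mp3"` is a placeholder path; the wrapper function is my own.

```python
# Hedged sketch of transcribing with whisper-jax on a TPU. The import is
# done inside the function so the sketch only needs the package when called.
def transcribe_with_whisper_jax(audio_path: str) -> str:
    from whisper_jax import FlaxWhisperPipline  # note: "Pipline" is the library's spelling

    # Instantiate once and reuse: the first call JIT-compiles for the TPU,
    # so subsequent calls are where the ~60x real-time speed shows up.
    pipeline = FlaxWhisperPipline("openai/whisper-large-v2")
    outputs = pipeline(audio_path)
    return outputs["text"]
```

The checkpoint name and the exact output structure are assumptions based on my recollection of the project's docs, so check them against the repo before relying on this.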
-
Is it possible to use the TPU in Colab? I've been using the GPU (cuda) but have run into rate limits. Colab also offers a TPU instead of a GPU, and I'd like to use it. I tried 'xla' (and all the other device strings: ipu, xpu, mkldnn, opengl, opencl, ideep, hip, ve, ort, mps, lazy, vulkan, meta, hpu), but none of them worked. They all give the error:
How can I use TPU with whisper?
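For what it's worth, a device string like 'xla' is only valid when the matching PyTorch backend (here, torch_xla) is actually installed, which is why every string in that list errors out on a stock Colab runtime. A minimal fallback helper might look like this; the function name and the TPU-first preference order are my own assumptions.

```python
import importlib.util

def pick_device_name() -> str:
    """Pick a device string for whisper.load_model(..., device=...).

    Prefers TPU ('xla', only meaningful when torch_xla is installed),
    then CUDA, then CPU.
    """
    if importlib.util.find_spec("torch_xla") is not None:
        return "xla"
    if importlib.util.find_spec("torch") is not None:
        import torch
        if torch.cuda.is_available():
            return "cuda"
    return "cpu"
```

Note this only checks that the backend package is present, not that Whisper's kernels actually run on it, which is the open question in this thread.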