How can I use model.transcribe for multiple concurrent requests? #1390
Unanswered · Guneetsinghtuli asked this question in Q&A
I am running a Flask server with Whisper. When I send multiple concurrent requests to the server, I get an exception:

    The size of tensor a must match the size of tensor at non-singleton dimension

It works fine for a single request, but not with multiple concurrent requests.
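For context, here is a minimal sketch of the kind of setup described above; it is an assumption, not the asker's actual code. The /transcribe route, the audio_path request field, the "base" model size, and the threading.Lock used to serialize calls to the shared model (a common workaround, not an answer confirmed in this thread) are all illustrative choices.

```python
import threading

import whisper
from flask import Flask, jsonify, request

app = Flask(__name__)

# A single Whisper model shared by all request-handling threads.
model = whisper.load_model("base")  # model size is an illustrative assumption

# model.transcribe() is not documented as thread-safe; serializing calls with
# a lock is one possible mitigation (an assumption, not confirmed in this thread).
transcribe_lock = threading.Lock()


@app.route("/transcribe", methods=["POST"])
def transcribe():
    audio_path = request.json["audio_path"]  # hypothetical request field
    with transcribe_lock:
        result = model.transcribe(audio_path)
    return jsonify({"text": result["text"]})


if __name__ == "__main__":
    # threaded=True lets Flask serve requests concurrently, which is the
    # scenario where the tensor-size mismatch was reported.
    app.run(threaded=True)
```

Without the lock, concurrent requests would call transcribe() on the same model object at the same time, which appears consistent with the tensor-size mismatch reported above, though the thread does not confirm the root cause.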
Replies: 1 comment

- I'm looking for an answer on this same topic.