Using subtrans with oobabooga openai extension #69
Alihkhawaher started this conversation in General
Replies: 1 comment
- After further testing, it looks like the model probably has a limit of around 200 prompt tokens.
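If a roughly 200-token prompt limit is the culprit, one quick check is to count the tokens in a failing prompt with the model's own tokenizer and compare against what the server actually accepts. A minimal sketch, assuming the Hugging Face transformers package is installed and the jais-13b tokenizer (the model mentioned in the original post below) can be downloaded:

```python
from transformers import AutoTokenizer

# Load the tokenizer for the model being served.
# Assumption: jais-13b ships custom code, so trust_remote_code is required.
tokenizer = AutoTokenizer.from_pretrained(
    "inception-mbzuai/jais-13b", trust_remote_code=True
)

# Paste the prompt that subtrans sends (visible in oobabooga's --verbose output).
prompt = "Translate the following subtitles into Arabic:\n..."

token_count = len(tokenizer.encode(prompt))
print(f"Prompt length: {token_count} tokens")
```

If the count is well above the server's truncation length, reducing the number of subtitle lines sent per request (if subtrans exposes a batch-size setting) would be the next thing to try.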
- First, thank you for this amazing app.
Here is my issue: I am having no luck using subtrans with oobabooga.
My setup: oobabooga is started with the following command:

```
python .\server.py --verbose --api --extension openai
```
Then I loaded the "inception-mbzuai/jais-13b" model; it works in oobabooga, and it is the only model that can translate to Arabic accurately.
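For reference, here is a minimal sketch of how one might verify that the openai extension endpoint is reachable and serving the loaded model, independently of subtrans. The port (5001), the /v1 path, and the placeholder API key are assumptions based on the extension's defaults; check the address printed when server.py starts:

```python
from openai import OpenAI

# Point the client at the local oobabooga server instead of api.openai.com.
# Assumption: the openai extension listens on port 5001 and exposes /v1.
client = OpenAI(base_url="http://localhost:5001/v1", api_key="not-needed-locally")

# oobabooga generally answers with whatever model is currently loaded,
# so the model name here is mostly informational.
response = client.chat.completions.create(
    model="inception-mbzuai/jais-13b",
    messages=[{"role": "user", "content": "Translate into Arabic: Good morning."}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```

If this request fails the same way, the problem is between the client and the extension rather than anything specific to subtrans.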
In subtrans I am using the following settings:

I get the following error in subtrans:

and in oobabooga:
I have been trying to make this work for hours and finally gave up. I hope someone can help with this.