generating weird answers #60
Replies: 1 comment
-
Try formatting the prompt in the Alpaca format, or use the Instruct mode from the UI. |
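For reference, the Alpaca instruction template wraps the user's question in a fixed preamble with `### Instruction:` / `### Response:` markers. A minimal sketch (the exact preamble wording can vary slightly between fine-tunes, so check the model card):

```python
# Minimal sketch of the Alpaca prompt format; the preamble wording
# may differ slightly between Alpaca-style fine-tunes.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def format_alpaca_prompt(instruction: str) -> str:
    """Wrap a plain question in the Alpaca instruction template."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

if __name__ == "__main__":
    print(format_alpaca_prompt("What is the capital of France?"))
```

Sending the raw question without this wrapper is a common cause of incoherent output from instruction-tuned models, since they were trained to see exactly this structure.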
-
First of all, thanks a lot for the amazing project. My GPU is a 3060 12 GB and can't run the 13B model, and somehow oobabooga doesn't work on my CPU. Then I found this project; it's so convenient and easy to deploy!
But when I tried it on my computer, I got really weird answers from the models (and it was also quite slow).
The model I'm using is this one: https://huggingface.co/anon8231489123/gpt4-x-alpaca-13b-native-4bit-128g/tree/main/gpt4-x-alpaca-13b-ggml-q4_1-from-gptq-4bit-128g. Here are the running time and the response I got; these are clearly wrong. Did I do anything wrong here? Thanks!