Help with spacy-llm Llama2 config #324
Unanswered
fireblonde asked this question in Q&A
Replies: 1 comment · 17 replies
-
Hi @fireblonde! Let's start with the second error message. Have you requested access to the gated Llama 2 weights on Hugging Face and authenticated with your account token? See the "Gated models on Hugging Face" section here.
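As a minimal sketch of that authentication step (assuming access to the gated weights has already been granted on the model page; the `HF_TOKEN` variable name is illustrative, not something spacy-llm requires):

```python
# Minimal sketch of authenticating with Hugging Face so gated weights such as
# Llama 2 can be downloaded. Assumes access has already been granted on the
# model page; the token itself comes from your Hugging Face account settings.
from huggingface_hub import login

# Prompts for the token interactively ...
login()

# ... or pass a token read from an environment variable instead.
# import os
# login(token=os.environ["HF_TOKEN"])
```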
-
Hi,
I am following the tutorial for Llama2. My config is as follows:
and I am getting this error:
I tried using all of the available llama2 models. If I use the smallest model, then I get this error (similar to another discussion):
Can you please help me with this? Is there a possibility to use accelerate, and if yes, could you give me more detailed instructions? Thank you so much in advance! Looking forward to your answers! :)
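For context, here is a minimal sketch (not the config from the question, which is omitted above) of the kind of Llama 2 pipeline the spacy-llm tutorial describes, assembled in Python. The model name "Llama-2-7b-hf", the NER labels, and the config_init entry passing device_map="auto" (one way to let Hugging Face accelerate spread the weights over the available devices) are illustrative assumptions based on the spacy-llm docs; registry names can differ between spacy-llm versions.

```python
# Illustrative sketch of a spacy-llm pipeline backed by a Hugging Face
# Llama 2 model. Model name, labels, and config_init are assumptions,
# not taken from the question above.
import spacy

nlp = spacy.blank("en")
nlp.add_pipe(
    "llm",
    config={
        "task": {
            # Built-in NER task; the labels here are just examples.
            "@llm_tasks": "spacy.NER.v2",
            "labels": ["PERSON", "ORG", "LOC"],
        },
        "model": {
            "@llm_models": "spacy.Llama2.v1",
            "name": "Llama-2-7b-hf",
            # Passed through to the transformers model init; with the
            # accelerate package installed, device_map="auto" lets it
            # place layers on GPU/CPU as memory allows.
            "config_init": {"device_map": "auto"},
        },
    },
)

doc = nlp("Jack Dorsey founded Twitter in San Francisco.")
print([(ent.text, ent.label_) for ent in doc.ents])
```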