Add special token to llama 2 #764
generalsvr asked this question in Q&A (unanswered)
Question (generalsvr):

Hello! How can I add a new token to the model? I need a special separator token to interrupt the streaming response. Can I simply add a new token to the special_tokens config? Like this:

Replies: 1 comment

No, in this case you would need to use the tokens:
- `"<end>"`
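The suggested approach, using an existing token such as `"<end>"` as a stop marker rather than adding a new one, can be sketched on the client side as follows. This is a minimal, hypothetical sketch: the chunk iterable stands in for a model's streaming output and is not any library's actual API.

```python
def stream_until_stop(token_stream, stop="<end>"):
    """Yield streamed text chunks, halting when the stop marker appears.

    token_stream: any iterable of text chunks (a hypothetical stand-in
    for a model's streaming output). The stop marker may be split
    across chunk boundaries, so a small tail is buffered.
    """
    buffer = ""
    for chunk in token_stream:
        buffer += chunk
        if stop in buffer:
            # Emit only the text before the stop marker, then interrupt.
            yield buffer.split(stop, 1)[0]
            return
        # Hold back a tail that could be the start of a partial marker.
        safe = len(buffer) - (len(stop) - 1)
        if safe > 0:
            yield buffer[:safe]
            buffer = buffer[safe:]
    if buffer:
        yield buffer
```

For example, joining the output for the chunks `["Hello ", "wor", "ld<en", "d> ignored"]` produces `"Hello world"`, even though the `"<end>"` marker is split across two chunks.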