Replies: 1 comment
Opened a feature request issue for visibility: #7763
I think LLMs that work directly on raw bytes will become more interesting in the near future. Some interesting work in that direction:
I think it would be useful if something like `LLAMA_VOCAB_TYPE_RAW_BYTES` were added to `enum llama_vocab_type`, but I don't know what kind of changes that would imply elsewhere. That kind of vocabulary would still require special tokens, of course.