Replies: 1 comment
Yep, KoboldCpp should handle all formats automatically, since it infers the file type at load time. Let me know if a specific format is broken.
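For context, a minimal sketch of what "infers the file type at load time" could look like: peek at the leading magic bytes of the model file and dispatch to the matching loader. The magic values below follow the historical ggml/llama.cpp conventions (`ggml`, `ggmf`, `ggjt`); KoboldCpp's actual detection code may differ, so treat this as an illustration rather than its implementation.

```cpp
// Hypothetical sketch: detect the model format from the leading magic bytes.
// Magic constants follow the historical ggml/llama.cpp file conventions;
// the real dispatch logic in KoboldCpp may differ.
#include <cstdint>
#include <cstdio>
#include <string>

enum class FileFormat { UNKNOWN, GGML_OLD, GGMF, GGJT };

FileFormat detect_format(const std::string & path) {
    FILE * f = std::fopen(path.c_str(), "rb");
    if (!f) return FileFormat::UNKNOWN;

    uint32_t magic = 0;
    const size_t read = std::fread(&magic, sizeof(magic), 1, f);
    std::fclose(f);
    if (read != 1) return FileFormat::UNKNOWN;

    switch (magic) {
        case 0x67676d6c: return FileFormat::GGML_OLD; // 'ggml' - legacy unversioned files
        case 0x67676d66: return FileFormat::GGMF;     // 'ggmf' - versioned format
        case 0x67676a74: return FileFormat::GGJT;     // 'ggjt' - mmap-friendly layout
        default:         return FileFormat::UNKNOWN;
    }
}

int main(int argc, char ** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: %s <model.bin>\n", argv[0]); return 1; }
    switch (detect_format(argv[1])) {
        case FileFormat::GGML_OLD: std::puts("legacy ggml file"); break;
        case FileFormat::GGMF:     std::puts("ggmf file");        break;
        case FileFormat::GGJT:     std::puts("ggjt file");        break;
        default:                   std::puts("unknown format");   break;
    }
    return 0;
}
```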
It appears that over at gpt4all they have moved towards GPT-J and, as part of that, have released a ggml-format model. They now direct users to https://github.com/nomic-ai/gpt4all-chat for inference, most of which seems to be based on ggerganov's ggml repo. One thing of note: their gptj.cpp isn't up to date with ggml-org/ggml@0265f08. So the first thought was, "huh, I wonder if this will load fine on koboldcpp?" The answer is: yep! (A rough header check is sketched below.)
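One way to convince yourself the file should be loadable is to dump its header the way the upstream ggml gpt-j example does: a `ggml` magic followed by the GPT-J hyperparameters as int32 values. The field order below is taken from the stock ggml gpt-j example and is an assumption about the gpt4all file; it is a sketch, not the koboldcpp loader itself.

```cpp
// Hypothetical check: read the GPT-J ggml header the way the upstream
// ggml gpt-j example does (magic, then int32 hyperparameters). The field
// order here is assumed to match the gpt4all release; adjust if it differs.
#include <cstdint>
#include <cstdio>

int main(int argc, char ** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: %s <ggml-model.bin>\n", argv[0]); return 1; }
    FILE * f = std::fopen(argv[1], "rb");
    if (!f) { std::perror("fopen"); return 1; }

    uint32_t magic = 0;
    if (std::fread(&magic, sizeof(magic), 1, f) != 1 || magic != 0x67676d6c) {
        std::puts("not a legacy ggml file");
        std::fclose(f);
        return 1;
    }

    // GPT-J hyperparameters as laid out by the stock ggml gpt-j example.
    int32_t hparams[7] = {0}; // n_vocab, n_ctx, n_embd, n_head, n_layer, n_rot, ftype
    if (std::fread(hparams, sizeof(int32_t), 7, f) != 7) {
        std::puts("truncated header");
        std::fclose(f);
        return 1;
    }
    std::fclose(f);

    std::printf("n_vocab=%d n_ctx=%d n_embd=%d n_head=%d n_layer=%d n_rot=%d ftype=%d\n",
                hparams[0], hparams[1], hparams[2], hparams[3],
                hparams[4], hparams[5], hparams[6]);
    return 0;
}
```

If the magic and hyperparameters read back sensibly, any loader that understands the standard GPT-J ggml layout has a good chance of accepting the file, which matches what was observed with koboldcpp here.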