Context size limitation #75
touseefurrehmanofficial started this conversation in General
Replies: 1 comment 3 replies
-
As the PocketPal README says, they also use llama.cpp for running GGUF models, so maybe their repo can help in solving this problem.
-
Shubham, I hope you are doing well. First, thanks for resolving the recent issues in llminference.cpp. I'm opening this discussion about the context size limitation in this app. Can you please tell me whether this issue comes from llama.cpp or from the smollm module, so that I can research it and try to solve it?
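For context on where the limit might live: in llama.cpp, the usable context window is set by the `n_ctx` field of `llama_context_params` when the context is created, and it is effectively capped by the context length the model was trained with. If the app keeps `n_ctx` fixed, the usual workaround is to truncate the oldest conversation history so the prompt still fits. A minimal sketch of that idea (the function name, token IDs, and window size here are hypothetical, not the app's actual code):

```python
def fit_to_context(tokens: list[int], n_ctx: int, n_reserved: int = 0) -> list[int]:
    """Keep only the most recent tokens that fit in the context window,
    reserving n_reserved slots for the model's generated output."""
    budget = n_ctx - n_reserved
    if budget <= 0:
        raise ValueError("context window too small for the reserved output")
    # Drop the oldest tokens; keep the newest `budget` tokens.
    return tokens[-budget:]

history = list(range(10))  # pretend token IDs 0..9
print(fit_to_context(history, n_ctx=8, n_reserved=2))  # keeps the last 6 tokens
```

If the limitation turns out to be the model itself (e.g. SmolLM's trained context length), raising `n_ctx` alone won't help; truncation or a model with a longer context would be needed.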