Replies: 4 comments · 3 replies
-
The last time I checked, it was. Which model did you try?
-
Thanks for the reply. See the failed models below.
Failed today: mradermacher/Falcon3-Mamba-R1-v0-GGUF
And these yesterday:
This one did load: Triangle104/Falcon3-Mamba-7B-Instruct-Q8_0-GGUF (though it rambles after a bit)
-
I just tried mamba-2.8b-hf-Q2_K and it worked fine, though it was slow. Which quant did you use?
-
I can confirm that the Bartowski Mamba model loads. Look at this Q5 one: https://huggingface.co/PrunaAI/state-spaces-mamba-2.8b-hf-GGUF-smashed/tree/main/state-spaces
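For anyone who wants to try the linked repo, here is a minimal sketch of fetching a quant from it and pointing KoboldCpp at the file. The GGUF filename below is an assumption (check the repo's file list for the actual Q5 name), and it assumes you have koboldcpp.py checked out locally with its --model flag available.

```python
# Minimal sketch: download one GGUF quant from the linked repo, then launch KoboldCpp with it.
import subprocess
from huggingface_hub import hf_hub_download

gguf_path = hf_hub_download(
    repo_id="PrunaAI/state-spaces-mamba-2.8b-hf-GGUF-smashed",
    subfolder="state-spaces",
    filename="mamba-2.8b-hf.Q5_K_M.gguf",  # hypothetical filename: verify against the repo
)

# Hand the downloaded model to KoboldCpp (assumes koboldcpp.py is in the working directory).
subprocess.run(["python", "koboldcpp.py", "--model", gguf_path], check=True)
```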
-
I have tried several Mamba-based GGUFs and am getting this error: ggml/src/ggml.c:4461: GGML_ASSERT(ggml_is_matrix(c)) failed
Is KoboldCpp compatible with Mamba-based GGUFs? (If so, can you suggest one?)