KeyError: ('torch', 'BoolStorage') when converting NSQL model #2587
Replies: 3 comments
- You can't, since the model you are trying to use is not based on the LLaMA architecture.
- Thank you. Is there a way I can do the conversion myself, or is it a big lift?
- It is not the conversion to the ggml file format that is the main problem: you need inference code written for each architecture, and llama.cpp currently only supports the LLaMA architecture. Inference code for some other architectures can be found at https://github.com/ggerganov/ggml/tree/master/examples, but none of it supports the architecture used by the model you are trying to use.
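  If you want to check which architecture a checkpoint uses before attempting a conversion, you can look at the architectures field in its config.json on the Hugging Face hub. A minimal sketch (the raw-file URL layout is the standard hub convention; this is illustrative only, not part of llama.cpp):

  ```python
  import json
  from urllib.request import urlopen

  # Fetch the model's config.json straight from the Hugging Face hub.
  model_id = "NumbersStation/nsql-350M"
  url = f"https://huggingface.co/{model_id}/raw/main/config.json"

  with urlopen(url) as resp:
      config = json.load(resp)

  # A model that llama.cpp can run would declare a LLaMA architecture here
  # (e.g. "LlamaForCausalLM"); anything else needs different inference code.
  print(config.get("architectures"))
  print(config.get("model_type"))
  ```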
- Hi all,
Forgive me if this is a stupid question, as I am still learning my way around llama.cpp.
I am trying to convert https://huggingface.co/NumbersStation/nsql-350M to llama.cpp using the convert.py script. When I do so, I get the following error:
KeyError: ('torch', 'BoolStorage')
Do you have any tips on what I can do to get around this? Can I change the storage type from BoolStorage to IntStorage or something similar, given the model is already pretrained?
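Something like this is what I had in mind, as a rough sketch only. I am assuming the weights live in a single pytorch_model.bin file, and I do not know whether convert.py would accept the result even if the cast works:

```python
import torch

# Load the original checkpoint on the CPU.
state_dict = torch.load("pytorch_model.bin", map_location="cpu")

# Cast any boolean tensors to uint8 so they no longer use BoolStorage.
for name, tensor in list(state_dict.items()):
    if isinstance(tensor, torch.Tensor) and tensor.dtype == torch.bool:
        print(f"casting {name} from bool to uint8")
        state_dict[name] = tensor.to(torch.uint8)

# Save a patched copy to feed to convert.py.
torch.save(state_dict, "pytorch_model.patched.bin")
```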
Thanks in advance.