How to use any model (onnx or gguf) with tauri-plugin-llm
#15013 · Unanswered
Keshav-writes-code asked this question in Q&A
Replies: 1 comment
You may be better off asking the plugin maintainers directly in their repo: https://github.com/crabnebula-dev/tauri-plugin-llm. It doesn't look good, though; pretty sure you'll have to modify the plugin source code.
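For illustration only, here is a minimal sketch of the kind of patch the reply is suggesting. The names below (`SUPPORTED_MODELS`, `is_supported_strict`, `is_supported_patched`) are hypothetical and do not reflect tauri-plugin-llm's actual internals; the idea is simply that an allow-list check in a forked copy of the plugin could be relaxed to accept any model file in a known weight format:

```rust
// Hypothetical sketch — these names are illustrative, not
// tauri-plugin-llm's real API. Assumes the plugin validates model
// names against a hard-coded allow-list.

const SUPPORTED_MODELS: &[&str] = &["Qwen2.5-3B-Instruct"];

// Strict allow-list check: rejects anything not explicitly listed,
// which is why an unlisted model like SmolLM2-135M-Instruct fails.
fn is_supported_strict(model: &str) -> bool {
    SUPPORTED_MODELS.contains(&model)
}

// Relaxed check for a forked build: accept any file with a known
// weight format, shifting compatibility responsibility to the caller.
fn is_supported_patched(model_path: &str) -> bool {
    model_path.ends_with(".gguf") || model_path.ends_with(".onnx")
}

fn main() {
    // Unlisted model is rejected by the strict check...
    assert!(!is_supported_strict("SmolLM2-135M-Instruct"));
    // ...but passes the relaxed, extension-based check.
    assert!(is_supported_patched("SmolLM2-135M-Instruct.gguf"));
    println!("ok");
}
```

Note that even with validation bypassed, the model still has to be compatible with whatever inference backend the plugin uses, so a patch like this only removes the name check, not any real format or architecture constraints.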
So, I am trying to integrate a small language model called SmolLM2-135M-Instruct into a Tauri app. It is quite a bit smaller than the smallest supported model, Qwen's 3B, and I can't get it to work with tauri-plugin-llm. My question is: how can I bypass tauri-plugin-llm's model name validation to run any model I want?