Allow models loading on different backends #739
alexpatcas started this conversation in Ideas

Just like the title says: it would be great if we could load some models on Vulkan and some models on ROCm. For example, gpt-oss is great on Vulkan, but gemma or any VL model struggles badly. Allowing different models on different backends would give us the best of both worlds.

Replies: 1 comment
- This is possible today. Load …. Load …. Both can be loaded at the same time if you started the server with ….
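For readers landing here: the inline code in the reply above did not survive extraction, but one plausible way to get this behavior, assuming a llama.cpp-style `llama-server` where each backend ships as its own build, is simply to run one server process per backend on separate ports. The sketch below is not the project's actual mechanism; the binary paths, model files, and ports are placeholders.

```python
# Minimal sketch: run each model on the backend build that suits it,
# one server process per backend, each on its own port.
# All paths below are hypothetical examples, not a real layout.
import subprocess

SERVERS = [
    # (backend-specific server build, model file, port)
    ("/opt/llama.cpp-vulkan/llama-server", "models/gpt-oss-20b.gguf", 8081),
    ("/opt/llama.cpp-rocm/llama-server",   "models/gemma-3-12b.gguf", 8082),
]

procs = []
for binary, model, port in SERVERS:
    # -m, --port, and -ngl are standard llama-server flags; each process
    # uses whatever GPU backend its binary was compiled with.
    procs.append(subprocess.Popen(
        [binary, "-m", model, "--port", str(port), "-ngl", "99"]
    ))

try:
    for p in procs:
        p.wait()  # keep both servers running until interrupted
except KeyboardInterrupt:
    for p in procs:
        p.terminate()
```

A small reverse proxy in front of the two ports could then expose both models behind a single endpoint, routing each request by model name.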