Open
Labels
enhancement (New feature or request)
Description
Better interoperability with llama.cpp when the llama.cpp server is started with `--models-dir` and `--no-models-autoload`.
Currently, when I try to load a llama.cpp server model from the extension, it fails with:
`Error: 400 model is not loaded`
Why do I want this when I have Ollama?
For some tasks I simply have better luck with llama.cpp than with Ollama.
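One client-side direction for handling this: before sending a request, the extension could check the server's OpenAI-compatible `GET /v1/models` listing (which llama-server exposes) and detect the "model is not loaded" failure so it can react instead of surfacing a raw 400. A minimal sketch; the server address and the exact error text are assumptions taken from the message above, not a confirmed llama.cpp contract:

```python
import json
import urllib.request

# Assumed default llama-server address; adjust to your setup.
LLAMA_SERVER = "http://127.0.0.1:8080"

def is_model_not_loaded(status: int, body: str) -> bool:
    """Detect the '400 model is not loaded' response reported above.

    Matching on the message text is an assumption; llama.cpp may change
    the wording between versions.
    """
    return status == 400 and "model is not loaded" in body.lower()

def list_models(base_url: str = LLAMA_SERVER) -> list[str]:
    """List model IDs via the OpenAI-compatible /v1/models endpoint."""
    with urllib.request.urlopen(f"{base_url}/v1/models") as resp:
        data = json.load(resp)
    return [m["id"] for m in data.get("data", [])]
```

With `--no-models-autoload`, the extension could use a check like this to prompt the user (or trigger a load) rather than failing outright.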