It is possible to run other LLMs locally using LocalAI or other OpenAI-compatible local backends. In the config file we can point the API base URL at the local backend, for example:
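A minimal sketch of what that could look like, assuming the tool accepts an OpenAI-compatible base URL in its configuration and that LocalAI is serving on its default port 8080 (both assumptions, adjust to your setup):

```python
# Sketch: point an OpenAI client at a local, OpenAI-compatible backend such as LocalAI.
# The base URL, port, and model name are assumptions, not taken from this thread.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # assumed LocalAI endpoint
    api_key="not-needed",                 # local backends typically ignore the key
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # whatever model name the local backend exposes
    messages=[{"role": "user", "content": "Hello from a local backend!"}],
)
print(response.choices[0].message.content)
```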
Surely we can also run this locally with https://github.com/jmorganca/ollama
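The same pattern should carry over to Ollama, assuming a recent build that exposes an OpenAI-compatible endpoint on port 11434; the model name below is just an example of one that has been pulled locally:

```python
# Sketch: reuse the same OpenAI client against a local Ollama server.
# Assumes Ollama serves an OpenAI-compatible API at http://localhost:11434/v1
# and that the model has been pulled (e.g. `ollama pull llama2`).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # assumed Ollama endpoint
    api_key="ollama",                      # ignored by Ollama, but required by the client
)

response = client.chat.completions.create(
    model="llama2",  # any locally pulled model
    messages=[{"role": "user", "content": "Hello from Ollama!"}],
)
print(response.choices[0].message.content)
```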