-
It is possible to run other LLMs locally using LocalAI or other OpenAI-compatible local backends. In the config file we can point …
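A minimal sketch of what "pointing" at a local backend could look like: LocalAI exposes an OpenAI-compatible HTTP API, so a client only needs its base URL swapped. The port, endpoint path, and model name below are assumptions about a typical LocalAI setup, not taken from the project's actual config.

```python
# Hypothetical sketch: targeting a local OpenAI-compatible server
# (e.g. LocalAI) instead of api.openai.com. BASE_URL and the model
# name are assumptions; adjust them to your local setup.
import json
import urllib.request

BASE_URL = "http://localhost:8080/v1"  # assumed LocalAI endpoint

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a /chat/completions request in the OpenAI wire format."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("ggml-gpt4all-j", "Hello!")
print(req.full_url)  # http://localhost:8080/v1/chat/completions
```

Because the wire format is the same, any tool that lets you override the API base URL in its config should work against such a backend unchanged.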
-
Hi, have you considered using https://github.com/langchain-ai/langchain ?
It would be a nice addition: it makes it easy to swap out the LLM driving it, and soon we could run a completely open-source model locally.