Local Models #7
clickbrain started this conversation in General
Replies: 1 comment
-
Hey, sorry for the late reply. I have added support for Local Models, but function calling is a challenge at the moment. I saw Mistral has function calling support; I will add that soon.
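As an illustration only (not this project's actual implementation), here is a minimal sketch of what function calling against a local OpenAI-compatible endpoint could look like, assuming Ollama is serving a Mistral model with tool support; the model tag, port, and the `get_weather` tool are hypothetical.

```python
# Sketch: function calling against a local OpenAI-compatible server.
# Assumes Ollama is running on its default port and serving a Mistral
# model that supports tools; model name and tool are hypothetical.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # required by the client, ignored by Ollama
)

# Hypothetical tool definition, following the OpenAI tools schema
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="mistral",  # assumed model tag; depends on what has been pulled locally
    messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
    tools=tools,
)

# If the model decided to call the tool, the call shows up here
print(response.choices[0].message.tool_calls)
```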
-
This looks really powerful. Thanks for releasing it. Any chance you will make it work with Local Models using an endpoint with Ollama or other inference servers? I just won't use centralized AI services.
Thanks.
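For context, a local inference server such as Ollama exposes a plain HTTP API on the machine itself, so no request leaves the host. A minimal sketch, assuming Ollama's default port 11434 and a locally pulled `mistral` model (e.g. via `ollama pull mistral`):

```python
# Sketch: a single request to Ollama's native REST API on its default port.
# The "mistral" model tag is an assumption; use whatever model is pulled locally.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "mistral",
        "prompt": "Summarize why local inference avoids centralized AI services.",
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```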