Hi, I have been using the combination of Ollama + LiteLLM.
However, I recently stumbled across LocalAI and I am exploring it.
Since I was using Ollama earlier, I have downloaded many big models with it, so I was wondering: can I reference those previously downloaded models in LocalAI? If yes, how?
Replies: 1 comment

- Unfortunately, this sounds like a new feature that would need to be implemented, since people see Ollama + LiteLLM as an alternative to LocalAI (e.g. Dify). It would be sweet to see a bridge between the two, or at the very least model parity.
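Until such a bridge exists, a possible stopgap is to point LocalAI directly at the weights Ollama has already downloaded. The sketch below is a minimal, unofficial workaround, assuming Ollama keeps its weights as sha256-named GGUF blobs under `~/.ollama/models/blobs/`, that LocalAI serves `*.gguf` files found in its models directory via an optional YAML config, and that the paths and the `my-ollama-model` name are placeholders to adjust for your own setup:

```python
#!/usr/bin/env python3
"""Rough, unofficial sketch of the "bridge" idea above.

Assumptions (verify against your own install; none of this is an
official API of either project):
  * Ollama stores downloaded weights as sha256-named GGUF blobs
    under ~/.ollama/models/blobs/
  * LocalAI loads any *.gguf file in its models directory and reads
    an optional YAML config alongside it.
"""
from pathlib import Path

OLLAMA_BLOBS = Path.home() / ".ollama" / "models" / "blobs"
LOCALAI_MODELS = Path("/path/to/localai/models")  # adjust for your setup

# The weights blob is by far the largest file in the blob store;
# manifests and parameter blobs are tiny by comparison.
blob = max(OLLAMA_BLOBS.iterdir(), key=lambda p: p.stat().st_size)

# Symlink rather than copy, so the multi-GB file isn't duplicated on disk.
link = LOCALAI_MODELS / "my-ollama-model.gguf"
if not link.exists():
    link.symlink_to(blob)

# Minimal LocalAI model config pointing at the symlinked weights.
(LOCALAI_MODELS / "my-ollama-model.yaml").write_text(
    "name: my-ollama-model\n"
    "parameters:\n"
    "  model: my-ollama-model.gguf\n"
)
print(f"linked {blob.name} -> {link}")
```

Note this only helps for models LocalAI can load as GGUF (llama.cpp-style backends); if the blob Ollama stored is in some other format, the symlink won't be usable.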
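If the workaround takes, LocalAI should expose the linked model over its OpenAI-compatible API. A quick verification sketch, assuming LocalAI is listening on its default port 8080 and using the hypothetical `my-ollama-model` name from the previous snippet:

```python
"""Check that LocalAI picked up the linked model via its
OpenAI-compatible HTTP API (default port 8080 assumed)."""
import requests

BASE = "http://localhost:8080/v1"

# The linked model should appear in the model list.
print(requests.get(f"{BASE}/models").json())

# And it should answer a chat completion request.
resp = requests.post(
    f"{BASE}/chat/completions",
    json={
        "model": "my-ollama-model",
        "messages": [{"role": "user", "content": "Say hello."}],
    },
)
print(resp.json()["choices"][0]["message"]["content"])
```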