On-Prem AI / Local Hosted Models #2371
Unanswered
Clankcoll asked this question in 3. Q&A / helpdesk
Hi,
Is there a way to use locally (HomeLab) hosted AI models which are not publicly reachable? I clicked quickly through the extension setup but saw no way to use a model hosted on my own network (sometimes it's hosted on the same host, sometimes on a different one, but still in the same VLAN and IP range).
Kind regards,
-C

Replies: 1 comment

We support a fully local LLM setup: pick OpenAI as the provider and enter the Base URL + port you would use to connect to it in other chat clients.
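For anyone setting this up, here is a minimal sketch of what that configuration amounts to in client code, assuming an OpenAI-compatible local server such as Ollama on its default port 11434; the base URL, model name, and API key value are assumptions to adapt to your own setup (local servers typically accept any key):

```python
# Minimal sketch: talk to a locally hosted, OpenAI-compatible LLM server.
# Assumes Ollama's OpenAI-compatible endpoint at http://localhost:11434/v1;
# replace the host/port with the machine on your VLAN that serves the model.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # your local server's Base URL + port
    api_key="not-needed-locally",          # placeholder; most local servers ignore it
)

response = client.chat.completions.create(
    model="llama3",  # whichever model your local server exposes
    messages=[{"role": "user", "content": "Hello from the HomeLab!"}],
)
print(response.choices[0].message.content)
```

The same Base URL + port is what goes into the extension's OpenAI provider settings.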