You can run Ollama or LM Studio on a desktop machine and serve it over the local network. That's the only option I know of, but there may be other ways.
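As a minimal sketch of the Ollama route: by default its server only listens on localhost, so you set `OLLAMA_HOST` to bind all interfaces, then hit its HTTP API (default port 11434) from another device on the LAN. The IP address and model name below are placeholders; substitute your desktop's actual LAN address and an installed model.

```shell
# On the desktop: bind the Ollama server to all network interfaces
# instead of 127.0.0.1 only, so other LAN devices can reach it.
OLLAMA_HOST=0.0.0.0 ollama serve

# From another machine on the LAN: query the API on port 11434.
# Replace 192.168.1.50 with the serving machine's IP, and "llama3"
# with a model you have pulled.
curl http://192.168.1.50:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Hello",
  "stream": false
}'
```

LM Studio works similarly: its local server exposes an OpenAI-compatible endpoint that you can enable to listen on the network from its server settings.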

Replies: 1 comment

Answer selected by PixeroJan