Ollama support
#9369
Replies: 2 comments 2 replies
- You can use https://github.com/sigoden/aichat and integrate it to some degree, especially if you use a terminal multiplexer or a terminal with split windows.
- This is the main feature I'm missing in Helix. Adding GPT support via an LSP, backed by locally run models through Ollama, sounds like a great way to preserve privacy while enabling Helix to use GPT models.
- LLMs are increasingly used to provide coding support. Ollama is a great project that acts as middleware for running local LLMs. Adding support for it would be great.
  Website: https://ollama.ai
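To make the request concrete: Ollama exposes a local HTTP API (by default on port 11434) with a `/api/generate` endpoint, which is what an editor integration would talk to. A minimal sketch, assuming `ollama serve` is running and a model such as `llama2` has already been pulled; the helper names here are illustrative, not part of any existing integration:

```python
import json
import urllib.request

# Build the JSON payload Ollama's /api/generate endpoint expects.
# "stream": False requests a single JSON response instead of a token stream.
def build_payload(prompt, model="llama2"):
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

# Send a prompt to a locally running Ollama server and return the generated text.
# Assumes `ollama serve` is listening on the default port 11434 and the chosen
# model has already been fetched with `ollama pull`.
def ollama_generate(prompt, model="llama2", host="http://localhost:11434"):
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=build_payload(prompt, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

An editor-side integration, whether an LSP server or a plugin, would essentially wrap calls like this; nothing in the request itself is specific to Helix.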