Ollama (or any other HTTP POST based LLM) integration #1022
Unanswered · glemarivero asked this question in Q&A · 0 replies
Hi, I wanted to know whether it is possible to integrate Ollama, or any other HTTP POST based model, as an LLM component.
Since Ollama is basically a server running on localhost that I can query with POST requests, it should be straightforward to wire it up as an LLM block, but I wasn't able to get it working.
Thanks for your help!
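For reference, querying Ollama directly is just a JSON POST against its default endpoint (`http://localhost:11434/api/generate`). A minimal sketch using only the standard library; the `generate` helper here is hypothetical glue, and it would still need to be adapted to whatever interface the project expects for an LLM block:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_payload(model: str, prompt: str) -> dict:
    # stream=False asks Ollama for a single JSON object
    # instead of a stream of newline-delimited chunks.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str, url: str = OLLAMA_URL) -> str:
    # POST the JSON payload and return the "response" field of the reply.
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Wrapping `generate` in the component interface (e.g. implementing whatever `__call__`/`run` method the LLM block expects) is then the only project-specific part.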