How can I use a vllm deployed LLM in Flowise? #2788
Unanswered
benjaminmateev asked this question in Q&A
Replies: 1 comment
-
As long as you configure vllm.entrypoints to the OpenAI-compatible server (or wrap vLLM with something like FastAPI), you can use the ChatLocalAI node to interact with it.
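As a minimal sketch of that setup: assuming vLLM is started with its OpenAI-compatible entrypoint (e.g. `python -m vllm.entrypoints.openai.api_server --model <your-model> --port 8000`), the ChatLocalAI node just needs the base path `http://<host>:8000/v1` and the served model name. The snippet below verifies the endpoint with the same kind of request Flowise will send; the host, port, and model name are placeholders for your own deployment.

```python
# Quick check that a vLLM OpenAI-compatible endpoint is reachable before
# wiring it into Flowise. Assumes vLLM was launched with something like:
#   python -m vllm.entrypoints.openai.api_server --model <your-model> --port 8000
# Host, port, and model name below are placeholders, not fixed values.
import requests

BASE_URL = "http://localhost:8000/v1"  # same base path you enter in the ChatLocalAI node

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": "<your-model>",  # must match the model name vLLM is serving
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
        "max_tokens": 64,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If this returns a completion, point the ChatLocalAI node's base path at the same `/v1` URL and set the model name accordingly; an API key is typically not needed unless you configured one on the vLLM server.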
-
Hey Flowise community,
I am experimenting with Flowise as a solution to quickly build LLM flows for our copilot application. We fine-tune open-source models like Mistral and Llama 3 and run them on a Kubeflow cluster via KServe and vLLM. I have not been able to find any information on using vLLM in Flowise, or a node in the nodes overview that could be used for this case.
Is this at all possible? Is there some other node I can use that basically works the same way?
Thank you so much!
Ben