How can I pass the output of the OpenAI chat node to the Request Post #3190
Unanswered · simonator1001 asked this question in Q&A · Replies: 0 comments
Hi there, I'm trying to pass the output from the LLM into the body of the HTTP call inside Flowise, but I fail to see how I might bind it to a variable so I can use the LLM output there. Would anyone have any idea how I could do that? Thanks!
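For reference, outside of Flowise the underlying pattern is simply: take the model's reply text and embed it in the JSON body of a POST request. A minimal sketch in plain Python (the endpoint URL and the JSON field name `message` are placeholders, not anything Flowise-specific):

```python
import json

def build_post_body(llm_output: str) -> str:
    # Wrap the model's reply text in whatever JSON shape the target API expects.
    # The "message" key here is an assumption; adjust it to your endpoint.
    return json.dumps({"message": llm_output})

body = build_post_body("Hello from the model")
print(body)  # {"message": "Hello from the model"}

# Sending it would then look like (endpoint URL is a placeholder):
#   import urllib.request
#   req = urllib.request.Request(
#       "https://example.com/webhook",
#       data=body.encode("utf-8"),
#       headers={"Content-Type": "application/json"},
#   )
#   urllib.request.urlopen(req)
```

In Flowise terms, the question is which node or variable mechanism performs the `build_post_body` step, i.e. exposes the LLM node's output so it can be templated into the HTTP node's request body.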