How to integrate Custom LLM API endpoints #4721
Unanswered · pradeepb89 asked this question in Q&A
Replies: 0
We have custom LLM API endpoints provided by a company, and we have been asked to use them via the "Azure OpenAI API" credentials.

Surprisingly, it works with n8n but not with FlowiseAI. Did I miss something?
Kindly help :)
In FlowiseAI, the "Azure OpenAI Api Instance Name" field is mandatory, but we don't have one. Even if we supply the subscription name there, the Flowise SDK tries to connect to a host ending in ".azure.com", which is not what we expect. Maybe I am wrong.
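To illustrate what I mean (this is my understanding of the behavior, not Flowise's actual code): Azure-style SDKs typically derive the request host from the instance name, so whatever name we put in that field still resolves under Azure's domain instead of our provider's custom endpoint. A minimal sketch, with `llm.example.com` as a hypothetical custom endpoint:

```python
def azure_endpoint_from_instance(instance_name: str) -> str:
    # Azure OpenAI endpoints follow this pattern, so an SDK that only
    # accepts an instance name will always build a *.openai.azure.com URL.
    return f"https://{instance_name}.openai.azure.com"

# Hypothetical endpoint a third-party provider might hand out:
custom_endpoint = "https://llm.example.com/v1"

# Putting any name in the "Instance Name" field still yields an Azure host,
# so requests never reach the custom endpoint above.
derived = azure_endpoint_from_instance("my-subscription")
print(derived)  # https://my-subscription.openai.azure.com
```

This is why, in my view, a field that accepts a full base URL (rather than only an instance name) is needed to use a non-Azure endpoint with these credentials.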
Also, using AzureChatOpenAI from LangChain in Python with the same credentials works as expected.