Custom Endpoint 400 (no body) error #5786
-
I've just cloned the repo, so the only change I've made is trying to add a custom endpoint. I am running a model in Azure AI Foundry that is not OpenAI or AI Services, so I think a custom endpoint is the only way to connect. I am able to connect to the model directly. Here is my setup in librechat.yaml:
If I set forcePrompt to true, then I get this error:

I also don't quite understand what the models field is. Is it appended onto the URL somewhere when making an API call?
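(For reference, a custom endpoint in librechat.yaml generally takes the shape below. All names, the URL, and the env variable are placeholders, not taken from the original config. As I understand it, the models list is not appended to the URL: it populates the model picker in the UI, and the selected name is sent as the model field in the JSON request body.)

```yaml
# Hypothetical sketch of a custom endpoint in librechat.yaml.
# baseURL, apiKey variable, and model names are placeholders.
endpoints:
  custom:
    - name: "Azure Foundry Model"      # label shown in the UI
      apiKey: "${FOUNDRY_API_KEY}"     # read from your .env
      baseURL: "https://example.models.ai.azure.com/v1"
      models:
        default: ["my-model"]          # populates the model dropdown
        fetch: false                   # don't query /models to list them
      forcePrompt: false               # keep false for chat-completions-style APIs
```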
Replies: 2 comments 3 replies
-
Hard to say without your config, but you can also try via the Azure serverless config: https://www.librechat.ai/docs/configuration/azure#serverless-inference-endpoints
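A sketch of the serverless shape from that docs page (the group name, key variable, host, and model name here are placeholders, so adjust them to your deployment):

```yaml
# Hypothetical Azure serverless group in librechat.yaml.
# baseURL host and model name are placeholders.
endpoints:
  azureOpenAI:
    groups:
      - group: "serverless-example"
        apiKey: "${AZURE_FOUNDRY_KEY}"
        baseURL: "https://my-deployment.my-region.models.ai.azure.com/v1/"
        serverless: true
        models:
          my-model: true
```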
-
What config would that be? I tried the serverless method:

And saw the same error in the UI. These are the errors in the console:

I do have a serverless deployment in Azure AI Foundry, but the endpoint I have is different from the format I can see in their docs:

While I have: