[Question]: Serverless hosted endpoints failing with new version #8223
Replies: 8 comments 10 replies
-
Logs in the backend.
-
I am using endpoints from Azure.
-
```yaml
endpoints:
  azureOpenAI:
    groups:
      - group: "Azure AI Deepseek"
        assistants: false
        apiKey: "${AZURE_DEEPSEEK_API_KEY}"
        baseURL: "https://resource.region.models.ai.azure.com/v1/"
        version: "2024-08-01-preview"
        serverless: true
        models:
          DeepSeek-R1:
            deploymentName: "DeepSeek-R1"
      - group: "Azure AI Foundry"
        apiKey: "${LIBRECHAT_AZURE_OAI_API_KEY}"
        baseURL: "https://resource-region.services.ai.azure.com/models"
        version: "2024-05-01-preview"
        serverless: true
        dropParams: ["stream_options", "user"]
        models:
          Mistral-Large-2411:
            deploymentName: "Mistral-Large-2411"
          grok-3:
            deploymentName: "grok-3"
```
-
Same for me with LiteLLM custom endpoints using the latest v0.7.9-rc1, built about an hour ago; here is my fresh message that was going to be a new issue, but I scanned the recent list and found this thread. My description of this same situation:

Howdy all,

In builds at some point after the v0.7.8 release, my LibreChat gives errors for any custom endpoint model in the general chat models menu if the endpoint is not also made explicitly available to agents via an entry in the config:

> The "Thinking" provider is not available for use with Agents. Please go to your agent's settings and select a currently available provider.

In this case, "Thinking" is a custom endpoint, via our LiteLLM, that bundles & presents "thinking" models. Now, we totally have endpoints/models that we do not want to present as choices in Agents... because they do not work well with agents! Is it right that now, basically, any & all custom endpoints must always be added to (and displayed in) the agents model picker, or else they are 100% unusable in all LibreChat contexts? If so, why is there even an option for this? Or am I missing something? :D

Here is an example of one of our custom endpoints:
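(A minimal sketch of roughly what such a custom endpoint entry looks like in `librechat.yaml`; the name, baseURL, env var, and model IDs below are placeholders, not our actual deployment values.)

```yaml
# librechat.yaml — illustrative custom endpoint entry (placeholder names, URL, and env var)
endpoints:
  custom:
    - name: "Thinking"                     # display label shown in the model menu
      apiKey: "${LITELLM_API_KEY}"         # hypothetical env var holding the LiteLLM proxy key
      baseURL: "http://litellm:4000/v1"    # hypothetical LiteLLM proxy address
      models:
        default: ["deepseek-r1"]           # placeholder model IDs served by the proxy
        fetch: true                        # fetch the model list from the proxy at startup
      titleConvo: true
      titleModel: "current_model"
```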
Thanks!
-
The models we host are via Azure AI Foundry, but they are visible to the user as an endpoint and not inside Azure OpenAI. This setup is not working with the new release v0.7.9-rc1 @danny-avila
-
Hi, any response/update/activity on this? Just following up :)
-
This is still very, very much a problem as of the most-recent build. If you do a generic, general chat in the main chat window and pick from the menu a model that is not associated with an agents-available endpoint, you get the same error.

So, if you want to offer any models for general chat but do not want them to appear as models available for agents, then... too bad, error time. ¯\_(ツ)_/¯ Here's my dev deployment's config:
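(Roughly the shape I mean, with placeholder values rather than my real file; this sketch assumes the agents endpoint's `allowedProviders` list is what controls which providers the agent picker offers.)

```yaml
# Illustrative only — placeholder names, URLs, and env var
endpoints:
  agents:
    # only these providers are intended to show up for Agents
    allowedProviders: ["openAI", "anthropic"]
  custom:
    - name: "Thinking"                    # general-chat-only endpoint, intentionally not listed above
      apiKey: "${LITELLM_API_KEY}"        # hypothetical env var
      baseURL: "http://litellm:4000/v1"   # hypothetical LiteLLM proxy address
      models:
        default: ["deepseek-r1"]
        fetch: true
```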
-
Closed by #8487
-
Hi, I am experiencing an issue with the serverless-supported models, like the xAI and DeepSeek models.
I am using the latest v0.7.9-rc1 image for my setup. Sharing a video snippet for better understanding:
serverless.endpoint.mp4
I don't have any tool activated or anything; I am just using the models from DeepSeek and xAI, and it stops working.
Also, one question: can I disable file search in the normal chat via endpoints?