titleModel for custom endpoint #9002
Unanswered
wipash asked this question in Troubleshooting
Replies: 4 comments · 1 reply
- Weird, it's the same on two different instances of LibreChat I'm running.
- I tried setting the shared endpoint setting as well, with no change:

```yaml
endpoints:
  custom:
    - name: LiteLLM
      apiKey: "$${LITELLM_API_KEY}"
      baseURL: "http://litellm.litellm.svc.cluster.local:4000"
      models:
        default: ["azure-gpt-4.1"]
        fetch: true
      titleConvo: true
      titleModel: openai/gpt-4.1-mini
    - name: OpenRouter
      apiKey: "$${OPENROUTER_API_KEY}"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default: ["openai/gpt-4o"]
        fetch: true
      titleConvo: true
      titleModel: openai/gpt-4.1-nano
      modelDisplayLabel: "OpenRouter"
  assistants:
    disableBuilder: false
  agents:
    recursionLimit: 50
    disableBuilder: false
    capabilities:
      # - "execute_code"
      - "file_search"
      - "actions"
      - "tools"
      - "artifacts"
      - "ocr"
      - "chain"
      - "web_search"
  all:
    streamRate: 35
    titleConvo: true
    titleModel: openai/gpt-4.1-mini
    titleEndpoint: LiteLLM
```
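One way to sanity-check this setup is to confirm that the titleModel aliases above are actually served by the LiteLLM proxy. The following is a minimal sketch, not from the thread; it assumes the LiteLLM baseURL and `LITELLM_API_KEY` from the config above are reachable from where you run it, and it uses LiteLLM's OpenAI-compatible model list endpoint:

```python
import os

import requests

# Values assumed from the librechat.yaml above; adjust for your deployment.
BASE_URL = "http://litellm.litellm.svc.cluster.local:4000"
API_KEY = os.environ["LITELLM_API_KEY"]

# LiteLLM's proxy is OpenAI-compatible, so served model aliases are listed at /v1/models.
resp = requests.get(
    f"{BASE_URL}/v1/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
resp.raise_for_status()
served = {m["id"] for m in resp.json()["data"]}

# The aliases LibreChat is configured to use for chat and for titles.
for alias in ("azure-gpt-4.1", "openai/gpt-4.1-mini"):
    print(f"{alias}: {'served' if alias in served else 'NOT served'}")
```

Since the baseURL is a cluster-local hostname, running this from a pod in the same cluster (or through a port-forward) avoids DNS resolution issues.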
- What happened?
I have a custom endpoint defined as follows:
When starting a new conversation with a modelSpec or a model directly from this endpoint, LibreChat uses the current model for title generation instead of the defined titleModel.
I have not yet been able to test this against standard endpoints (e.g. OpenAI directly).
Version Information
ghcr.io/danny-avila/librechat-dev:latest@sha256:a4a51b3cc6e2e0f28060207c1c62b7bd6fe0d30278264cb84d96caa39837dfbe
Steps to Reproduce
1. Define a custom endpoint as above
2. Start a new conversation
3. LiteLLM logs show the title request being sent to the same model as the chat (a direct check against LiteLLM is sketched below)
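To separate LibreChat's behaviour from LiteLLM's routing, it may help to issue a title-style completion directly against the proxy with the configured titleModel alias and compare what LiteLLM logs for that request with what it logs for LibreChat's title request. Again a minimal sketch under the same assumptions (baseURL, key, and alias taken from the config above):

```python
import os

import requests

BASE_URL = "http://litellm.litellm.svc.cluster.local:4000"  # from the config above
API_KEY = os.environ["LITELLM_API_KEY"]
TITLE_MODEL = "openai/gpt-4.1-mini"  # the titleModel alias from librechat.yaml

# Standard OpenAI-compatible chat completion; the "model" field below is what
# LiteLLM should record for this request. Compare it with the model LiteLLM
# records when LibreChat generates a conversation title.
resp = requests.post(
    f"{BASE_URL}/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": TITLE_MODEL,
        "messages": [
            {"role": "user", "content": "Write a short title for: testing title generation"}
        ],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```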
What browsers are you seeing the problem on?
No response
Relevant log output
LiteLLM request, showing incorrect model (should be azure-gpt-4o-mini):
Screenshots
No response