Agent does not create Convo Title #7091
Replies: 5 comments 4 replies
-
What model are you using? You should switch the title model as needed. No issues here.
-
Same here.
...which does not give much information. When I change the agent's model to Claude Sonnet 3.7, it works fine, though.
-
Same here for agents using OpenAI models with the Google tool enabled.
-
I think this occurs because OpenAI reasoning models (o1, o3, o4) do not support the temperature parameter. Yet, in the titleConvo method, the title-generation payload is built with:

```js
const modelOptions = {
  model,
  temperature: 0.2,
  presence_penalty: 0,
  frequency_penalty: 0,
  max_tokens: 16,
};
```

https://github.com/danny-avila/LibreChat/blob/main/api/app/clients/OpenAIClient.js#L743

If we can set it to 1.0 (or unset it altogether so the default is used?), that might do the trick?

debug.log snippet:
error.log snippet:
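The suggestion above (dropping the sampling parameters for reasoning models) could be sketched like this. Note this is a hypothetical illustration, not LibreChat's actual code, and the model-name check is an assumption:

```javascript
// Hypothetical sketch: build title-generation options, omitting sampling
// parameters that OpenAI reasoning models (o1/o3/o4 families) reject.
// The regex below is an assumed heuristic, not LibreChat's detection logic.
function buildTitleOptions(model) {
  const isReasoningModel = /^o[134](-|$)/.test(model);
  const modelOptions = { model, max_tokens: 16 };
  if (!isReasoningModel) {
    // Only non-reasoning models get temperature and penalty settings.
    Object.assign(modelOptions, {
      temperature: 0.2,
      presence_penalty: 0,
      frequency_penalty: 0,
    });
  }
  return modelOptions;
}

console.log(buildTitleOptions('o3-mini'));
console.log(buildTitleOptions('gpt-4o'));
```

For example, `buildTitleOptions('o3-mini')` would contain no `temperature` key, while `buildTitleOptions('gpt-4o')` would keep the existing defaults.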
-
With the introduction of the Responses API, I noticed that agents using reasoning models produce broken conversation titles, as does a regular chat with a reasoning model. We exclusively use Azure OpenAI, and my logs show this:

```
{"level":"error","message":"[api/server/controllers/agents/client.js #titleConvo] Error The OPENAI_API_KEY environment variable is missing or empty; either provide it, or instantiate the OpenAI client with an apiKey option, like new OpenAI({ apiKey: 'My API Key' }).","stack":"Error: The OPENAI_API_KEY environment variable is missing or empty; either provide it, or instantiate the OpenAI client with an apiKey option, like new OpenAI({ apiKey: 'My API Key' }).\n at new OpenAI (/app/node_modules/@langchain/openai/node_modules/openai/client.js:86:19)\n at new CustomOpenAIClient (/app/node_modules/@librechat/agents/dist/cjs/llm/openai/index.cjs:53:1)\n at ChatOpenAI._getClientOptions (/app/node_modules/@librechat/agents/dist/cjs/llm/openai/index.cjs:126:27)\n at ChatOpenAI.betaParsedCompletionWithRetry (/app/node_modules/@langchain/openai/dist/chat_models.cjs:2443:37)\n at ChatOpenAI._generate (/app/node_modules/@langchain/openai/dist/chat_models.cjs:2226:35)\n at /app/node_modules/@langchain/core/dist/language_models/chat_models.cjs:254:96\n at Array.map ()\n at ChatOpenAI._generateUncached (/app/node_modules/@langchain/core/dist/language_models/chat_models.cjs:254:67)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async ChatOpenAI.invoke (/app/node_modules/@langchain/core/dist/language_models/chat_models.cjs:92:24)"}
```
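The error message itself suggests the workaround: resolve an API key explicitly rather than relying on `OPENAI_API_KEY` being set. A minimal sketch of that fallback logic, assuming an Azure-first setup (`AZURE_API_KEY` is a hypothetical variable name for illustration, not necessarily the one LibreChat reads):

```javascript
// Hypothetical sketch: pick an explicit API key with an Azure-first fallback,
// so the title client never depends on OPENAI_API_KEY implicitly.
// AZURE_API_KEY is an assumed name for illustration only.
function resolveApiKey(env) {
  const key = env.AZURE_API_KEY ?? env.OPENAI_API_KEY;
  if (!key) {
    throw new Error(
      'No API key found; pass one explicitly, e.g. new OpenAI({ apiKey })'
    );
  }
  return key;
}
```

The resolved key would then be passed as the `apiKey` client option, as the error message recommends.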
-
What happened?
Since recently, an Agent does not create a conversation title... it just creates "New Chat". It works fine when using the models directly in a chat; it fails only in an Agent.
Version Information
150116e
Steps to Reproduce
1.) create an Agent
2.) start a conversation
What browsers are you seeing the problem on?
No response
Relevant log output
```
2025-04-26T15:56:49.605Z error: [api/server/controllers/agents/client.js #titleConvo] Error 400 Invalid parameter: 'response_format' of type 'json_schema' is not supported with this ... [truncated] {"attemptNumber":1,"code":null,"error":{"code":null,"message":"Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs","param":null,"type":"invalid_request_error"},"headers":{"apim-request-id":"144e72ed-a266-4974-be15-dd20ac9237bd","azureml-model-session":"v20250415-5-168444713-4","content-length":"330","content-type":"application/json","date":"Sat, 26 Apr 2025 15:56:49 GMT","ms-azureml-model-error-reason":"model_error","ms-azureml-model-error-statuscode":"400","server":"Kestrel","strict-transport-security":"max-age=31536000; includeSubDomains; preload","x-content-type-options":"nosniff","x-ms-client-request-id":"144e72ed-a266-4974-be15-dd20ac9237bd","x-ms-deployment-name":"PTU_deployment","x-ms-rai-invoked":"true","x-ms-region":"Sweden Central","x-request-id":"ee773f46-0f39-4b1b-96c3-76d8a91cc4b8"},"level":"error","message":"[api/server/controllers/agents/client.js #titleConvo] Error 400 Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs","param":null,"request_id":"ee773f46-0f39-4b1b-96c3-76d8a91cc4b8","retriesLeft":6,"stack":"Error: 400 Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model.
Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs\n at APIError.generate (/app/node_modules/openai/error.js:45:20)\n at CustomAzureOpenAIClient.makeStatusError (/app/node_modules/openai/core.js:302:33)\n at CustomAzureOpenAIClient.makeRequest (/app/node_modules/openai/core.js:346:30)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async /app/node_modules/@librechat/agents/node_modules/@langchain/openai/dist/chat_models.cjs:2036:29\n at async RetryOperation._fn (/app/node_modules/p-retry/index.js:50:12)","status":400,"type":"invalid_request_error"}
```
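The log above shows the title request failing because the Azure deployment rejects a `json_schema` response format. One conceivable workaround is to only attach the structured-output request when the model supports it and otherwise fall back to plain text. This is a sketch under stated assumptions, not LibreChat's actual code; `supportsJsonSchema` is a hypothetical capability flag:

```javascript
// Hypothetical sketch: attach a json_schema response_format only when the
// target model/deployment supports Structured Outputs; otherwise request a
// plain-text title. supportsJsonSchema is an assumed flag for illustration.
function buildTitleRequest(model, supportsJsonSchema) {
  const request = {
    model,
    messages: [
      { role: 'user', content: 'Write a short title for this conversation.' },
    ],
  };
  if (supportsJsonSchema) {
    request.response_format = {
      type: 'json_schema',
      json_schema: {
        name: 'conversation_title',
        schema: {
          type: 'object',
          properties: { title: { type: 'string' } },
          required: ['title'],
        },
      },
    };
  }
  return request;
}
```

With the flag off, the request carries no `response_format` at all, which is what a deployment like the one in the log above would need.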
Screenshots
No response
Code of Conduct