What happened?
When using the gpt-5-chat-latest model, which is a chat ("intelligence") model, LiteLLM raises:
litellm.UnsupportedParamsError: gpt-5 models don't support temperature=0.3. Only temperature=1 is supported. To drop unsupported params, set `litellm.drop_params = True`.
However, gpt-5-chat-latest should accept the `temperature` parameter, because it is a chat model, not a reasoning model.
Here is a reproducible example:
import os
import openai

# OpenAI SDK client pointed at a LiteLLM proxy
client = openai.OpenAI(
    api_key=os.getenv("litellm_api_key"),
    base_url=os.getenv("litellm_url"),
)

client.chat.completions.create(
    model="openai/gpt-5-chat-latest",
    messages=[
        {
            "role": "user",
            "content": "What is the capital of México?",
        },
    ],
    temperature=0.3,
)
Relevant log output
litellm.UnsupportedParamsError: gpt-5 models don't support temperature=0.3. Only temperature=1 is supported. To drop unsupported params set `litellm.drop_params = True`
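Until the parameter mapping for gpt-5-chat-latest is corrected, one possible client-side workaround is to strip the parameters the proxy rejects before building the request. This is only a sketch: the `strip_unsupported` helper and the `REJECTED_PARAMS` table below are illustrative assumptions, not part of LiteLLM or the OpenAI SDK.

```python
# Hypothetical table of params a backend rejects per model family.
# (Not LiteLLM's real capability mapping - adjust to taste.)
REJECTED_PARAMS = {
    "gpt-5": {"temperature"},
}

def strip_unsupported(model: str, params: dict) -> dict:
    """Return a copy of params without keys rejected for this model."""
    for family, rejected in REJECTED_PARAMS.items():
        if family in model:
            return {k: v for k, v in params.items() if k not in rejected}
    return dict(params)

params = {"temperature": 0.3, "max_tokens": 100}
safe = strip_unsupported("openai/gpt-5-chat-latest", params)
# safe == {"max_tokens": 100}; splat into client.chat.completions.create(**safe, ...)
```

Alternatively, the error message itself suggests setting `litellm.drop_params = True` so LiteLLM silently drops unsupported parameters, though that masks the underlying misclassification rather than fixing it.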
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.75.6
Twitter / LinkedIn details
No response