[Bug]: OpenAI GPT-5 Chat model does not support "temperature" parameter #13781

@juanluissantamaria

Description

What happened?

When using the gpt-5-chat-latest model, which is a standard chat ("intelligence") model rather than a reasoning model, LiteLLM throws the following error:

litellm.UnsupportedParamsError: gpt-5 models don't support temperature=0.3. Only temperature=1 is supported. To drop unsupported params, set litellm.drop_params = True.

However, this GPT-5 model should accept the temperature parameter, because it is an intelligence model, not a reasoning model.

Here is a reproducible example:

import os
import openai

# Assumes the LiteLLM proxy URL and key are provided via environment variables
client = openai.OpenAI(
    api_key=os.getenv("litellm_api_key"),
    base_url=os.getenv("litellm_url"),  # fixed typo: was base_ur
)

client.chat.completions.create(
    model="openai/gpt-5-chat-latest",
    messages=[
        {
            "role": "user",
            "content": "What is the capital of México?",
        },
    ],
    temperature=0.3,
)

Relevant log output

litellm.UnsupportedParamsError: gpt-5 models don't support temperature=0.3. Only temperature=1 is supported. To drop unsupported params set `litellm.drop_params = True`
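As a temporary workaround (not a fix for the underlying model classification), the error message suggests enabling drop_params so LiteLLM silently drops parameters it believes are unsupported. Assuming the proxy is configured via a config.yaml, a minimal sketch:

```yaml
litellm_settings:
  drop_params: true  # drop params LiteLLM considers unsupported instead of erroring
```

Note that this drops temperature rather than honoring it, so the proper fix is for LiteLLM to stop treating gpt-5-chat-latest as a reasoning model.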

Are you a ML Ops Team?

No

What LiteLLM version are you on?

v1.75.6

