
BaseChatOpenAI handles temperature removal only for lowercase names of reasoning models #34003

@drygajlom


Checked other resources

  • This is a bug, not a usage question.
  • I added a clear and descriptive title that summarizes this issue.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
  • This is not related to the langchain-community package.
  • I posted a self-contained, minimal, reproducible example. A maintainer can copy it and run it AS IS.

Package (Required)

  • langchain
  • langchain-openai
  • langchain-anthropic
  • langchain-classic
  • langchain-core
  • langchain-cli
  • langchain-model-profiles
  • langchain-tests
  • langchain-text-splitters
  • langchain-chroma
  • langchain-deepseek
  • langchain-exa
  • langchain-fireworks
  • langchain-groq
  • langchain-huggingface
  • langchain-mistralai
  • langchain-nomic
  • langchain-ollama
  • langchain-perplexity
  • langchain-prompty
  • langchain-qdrant
  • langchain-xai
  • Other / not sure / general

Example Code (Python)

from langchain_core.utils.utils import convert_to_secret_str
from langchain_openai import AzureChatOpenAI

from my_project.auth import AzureAuthProvider

# Set up authentication
auth = AzureAuthProvider()
headers = {xxxxx}
token = convert_to_secret_str(auth.get_token())

headers["Authorization"] = f"Bearer {token.get_secret_value()}"

model_params = {
    "azure_endpoint": "xxxxxxx",
    "azure_deployment": "GPT-5-2025-xx-xx",
    "api_version": "xxxx-xx-xx",
    "api_key": token,
    "timeout": 30,
    "max_retries": 3,
    "default_headers": headers,
    "temperature": 0.7,
}

# Create LLM client
llm = AzureChatOpenAI(**model_params)

response = llm.invoke("Tell me a joke about chickens.")
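Until the case-sensitivity is fixed upstream, a workaround is to drop `temperature` from the parameters before constructing the client. The prefix list and helper below are assumptions for illustration, not `langchain-openai`'s actual logic:

```python
# Hypothetical workaround: strip `temperature` when the deployment name
# (compared case-insensitively) looks like a reasoning model that only
# supports the default temperature. The prefix list is an assumption.
REASONING_PREFIXES = ("o1", "o3", "o4", "gpt-5")

def sanitize_params(params: dict) -> dict:
    """Return a copy of params without `temperature` for reasoning-model
    deployments, regardless of the deployment name's casing."""
    deployment = params.get("azure_deployment", "")
    if deployment.lower().startswith(REASONING_PREFIXES):
        return {k: v for k, v in params.items() if k != "temperature"}
    return params

model_params = {
    "azure_deployment": "GPT-5-2025-xx-xx",
    "temperature": 0.7,
}
clean = sanitize_params(model_params)
print("temperature" in clean)  # False: stripped despite the uppercase name
```

The client can then be built with `AzureChatOpenAI(**sanitize_params(model_params))` so the request never carries the unsupported parameter.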

Error Message and Stack Trace (if applicable)

---------------------------------------------------------------------------
BadRequestError                           Traceback (most recent call last)
Cell In[4], line 1
----> 1 response = llm.invoke("Tell me a joke about chickens.")
      2 print(response)

File ~/my_project/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py:379, in BaseChatModel.invoke(self, input, config, stop, **kwargs)
    365 @override
    366 def invoke(
    367     self,
   (...)    372     **kwargs: Any,
    373 ) -> AIMessage:
    374     config = ensure_config(config)
    375     return cast(
    376         "AIMessage",
    377         cast(
    378             "ChatGeneration",
--> 379             self.generate_prompt(
    380                 [self._convert_input(input)],
    381                 stop=stop,
    382                 callbacks=config.get("callbacks"),
    383                 tags=config.get("tags"),
    384                 metadata=config.get("metadata"),
    385                 run_name=config.get("run_name"),
    386                 run_id=config.pop("run_id", None),
...
-> 1047         raise self._make_status_error_from_response(err.response) from None
   1049     break
   1051 assert response is not None, "could not resolve response (should never happen)"

BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'temperature' does not support 0.7 with this model. Only the default (1) value is supported.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_value'}}

Description

When the Azure deployment name is uppercase (e.g. `GPT-5-2025-xx-xx`), `BaseChatOpenAI` does not strip the unsupported `temperature` parameter before sending the request. The reasoning-model name check appears to compare against lowercase names only, so the request reaches the API with `temperature=0.7` and fails with the 400 error above.
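A minimal sketch of the suspected cause (the actual check in `langchain-openai` may differ): a case-sensitive prefix match on the model name misses uppercase deployment names, so `temperature` is never removed.

```python
# Sketch of the suspected bug (not the actual library code): a
# case-sensitive model-name check misses uppercase deployment names.
def is_reasoning_model_buggy(name: str) -> bool:
    # Case-sensitive: never matches "GPT-5-..."
    return name.startswith(("gpt-5", "o1"))

def is_reasoning_model_fixed(name: str) -> bool:
    # Case-insensitive: matches regardless of casing
    return name.lower().startswith(("gpt-5", "o1"))

print(is_reasoning_model_buggy("GPT-5-2025-xx-xx"))  # False
print(is_reasoning_model_fixed("GPT-5-2025-xx-xx"))  # True
```

Lowercasing the name before the prefix comparison would make temperature removal apply to deployments like the one in the reproduction above.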

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 24.6.0: Wed Oct 15 21:12:05 PDT 2025; root:xnu-11417.140.69.703.14~1/RELEASE_ARM64_T6030
Python Version: 3.12.3 (main, Sep 30 2025, 12:00:02) [Clang 17.0.0 (clang-1700.0.13.5)]

Package Information

langchain_core: 1.0.1
langchain: 1.0.2
langsmith: 0.4.38
langchain_openai: 1.0.1
langgraph_sdk: 0.2.9

Optional packages not installed

langserve

Other Dependencies

claude-agent-sdk: Installed. No version info available.
httpx: 0.28.1
jsonpatch: 1.33
langchain-anthropic: Installed. No version info available.
langchain-aws: Installed. No version info available.
langchain-community: Installed. No version info available.
langchain-deepseek: Installed. No version info available.
langchain-fireworks: Installed. No version info available.
langchain-google-genai: Installed. No version info available.
langchain-google-vertexai: Installed. No version info available.
langchain-groq: Installed. No version info available.
langchain-huggingface: Installed. No version info available.
langchain-mistralai: Installed. No version info available.
langchain-ollama: Installed. No version info available.
langchain-perplexity: Installed. No version info available.
langchain-together: Installed. No version info available.
langchain-xai: Installed. No version info available.
langgraph: 1.0.1
langsmith-pyo3: Installed. No version info available.
openai: 2.6.1
openai-agents: Installed. No version info available.
opentelemetry-api: 1.38.0
opentelemetry-exporter-otlp-proto-http: 1.38.0
opentelemetry-sdk: 1.38.0
orjson: 3.11.4
packaging: 24.2
pydantic: 2.12.3
pytest: 8.4.1
pyyaml: 6.0.3
requests: 2.32.5
requests-toolbelt: 1.0.0
rich: 13.9.4
tenacity: 9.1.2
tiktoken: 0.12.0
typing-extensions: 4.15.0
vcrpy: Installed. No version info available.
zstandard: 0.25.0

Labels

bug, langchain, openai
