Added
- `omit_temperature` parameter for `OpenAILLM`: Reasoning models (o1, o3, gpt-5-nano, etc.) only support the default temperature value. Set `omit_temperature=True` to omit the `temperature` parameter from API calls entirely.
- `omit_temperature` parameter for `AgenticQueryPipeline`: Propagated to the internally created `OpenAILLM` when no custom LLM is provided.
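A minimal sketch of the idea behind the flag: when `omit_temperature` is set, the `temperature` key is left out of the request payload rather than sent with any value. The helper name and payload fields below are illustrative assumptions, not the library's actual internals.

```python
# Hypothetical sketch: omit_temperature gates whether "temperature"
# appears in the API request payload at all.
def build_request_kwargs(model, messages, temperature=0.7, omit_temperature=False):
    kwargs = {"model": model, "messages": messages}
    if not omit_temperature:
        kwargs["temperature"] = temperature
    return kwargs

# Reasoning models (e.g. o1) only accept the default temperature,
# so the parameter is omitted entirely instead of being overridden.
payload = build_request_kwargs(
    "o1", [{"role": "user", "content": "hi"}], omit_temperature=True
)
assert "temperature" not in payload

# Non-reasoning models keep the usual behavior.
default_payload = build_request_kwargs("gpt-4o", [{"role": "user", "content": "hi"}])
assert default_payload["temperature"] == 0.7
```

Omitting the key (rather than sending `temperature=None` or a default) matters because some reasoning-model endpoints reject the parameter's presence outright.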