Release v0.1.23

@mazy06000 released this 11 Feb 06:05
59f7c46

Added

  • omit_temperature parameter for OpenAILLM: Reasoning models (o1, o3, gpt-5-nano, etc.) only support the default temperature value. Set omit_temperature=True to omit the temperature parameter from API calls entirely instead of sending an unsupported value.
  • omit_temperature parameter for AgenticQueryPipeline: Propagated to the internally created OpenAILLM when no custom LLM is provided.
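The intended behavior can be sketched as follows. This is a minimal illustration, not the library's actual implementation; everything except the omit_temperature parameter name and the model names is an assumption for the sake of the example:

```python
def build_chat_kwargs(model: str, temperature: float = 1.0,
                      omit_temperature: bool = False) -> dict:
    """Build the kwargs for a chat-completions API call.

    Reasoning models (o1, o3, gpt-5-nano, etc.) only accept the default
    temperature, so when omit_temperature is True the temperature key is
    dropped from the request entirely rather than sent with any value.
    """
    kwargs = {"model": model}
    if not omit_temperature:
        kwargs["temperature"] = temperature
    return kwargs

# Standard model: temperature is included as usual.
assert build_chat_kwargs("gpt-4o", temperature=0.2) == {
    "model": "gpt-4o", "temperature": 0.2
}

# Reasoning model: the temperature key is absent from the call.
assert build_chat_kwargs("o3", omit_temperature=True) == {"model": "o3"}
```

In actual use this would correspond to constructing the LLM as OpenAILLM(model="o3", omit_temperature=True), or passing omit_temperature=True to AgenticQueryPipeline so it reaches the internally created OpenAILLM.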