[BUG] Stream can't be disabled in OpenAIModel #778

@javixeneize

Description

Checks

  • I have updated to the latest minor and patch version of Strands
  • I have checked the documentation and this is not expected behavior
  • I have searched ./issues and there are no duplicates of my issue

Strands Version

1.0.1

Tools Package Version

N/A

Tools used

N/A

Python Version

3.13

Operating System

N/A

Installation Method

pip

Steps to Reproduce

When using `OpenAIModel`, there is no parameter to disable streaming (i.e. set it to false).
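For reference, a minimal reproduction sketch of how the model is constructed today; the import path `strands.models.openai` is assumed from the SDK layout, and neither the constructor nor `params` exposes a way to turn streaming off:

```python
# Minimal reproduction sketch (import path assumed from the SDK layout).
from strands.models.openai import OpenAIModel

model = OpenAIModel(
    client_args={
        "api_key": "<OPENAI_API_KEY>",  # placeholder for illustration
    },
    model_id="gpt-4o",
    params={
        "max_tokens": 1000,
        "temperature": 0.7,
    },
)
# There is no `stream` keyword here, and no documented config option
# for requesting non-streaming responses from the underlying client.
```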

Expected Behavior

Ideally, OpenAIModel should support such a parameter, something like:

```python
model = OpenAIModel(
    client_args={
        "api_key": "",
    },
    # **model_config
    model_id="gpt-4o",
    stream=False,
    params={
        "max_tokens": 1000,
        "temperature": 0.7,
    },
)
```

Actual Behavior

`stream` is not a valid parameter for `OpenAIModel`.

Additional Context

No response

Possible Solution

No response

Related Issues

No response
