Bedrock OpenAI OSS models support #8899
Replies: 2 comments 1 reply
-
I got the same error. Any fix?
-
The message is coming directly from the Bedrock API. Note that the AWS Bedrock integration currently misidentifies the streaming support behavior for some models. From what I can tell, based on the Bedrock OpenAI docs and on trying this model in Bedrock's playground, it does not support streaming. However, you can use Bedrock's OpenAI-compatible endpoint as a custom endpoint. Even used this way, streaming still does not work. It works for me with the following custom endpoint config, explicitly disabling streaming:
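As a rough sketch of such a config (the field names here are illustrative and depend on your client's custom-endpoint schema; the base URL is Bedrock's OpenAI-compatible endpoint and is region-dependent):

```json
{
  "name": "bedrock-gpt-oss",
  "baseURL": "https://bedrock-runtime.us-west-2.amazonaws.com/openai/v1",
  "apiKey": "<your Bedrock API key>",
  "model": "openai.gpt-oss-120b-1:0",
  "stream": false
}
```

The important part is pointing the client at the OpenAI-compatible base URL and forcing `stream` off so the client never issues a streaming request to a model that rejects it.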
-
What happened?
The new OpenAI OSS models, as available on Bedrock, fail with the error
The model is unsupported for streaming.
As per the AWS blog post, they should support streaming.
Version Information
v0.8.0-rc1
Steps to Reproduce
Add Bedrock models:
openai.gpt-oss-120b-1:0
openai.gpt-oss-20b-1:0
Query.
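For reference, a minimal chat-completions request body along these lines reproduces the error when sent to Bedrock's OpenAI-compatible endpoint (a sketch using the standard OpenAI request shape and one of the model IDs above, with streaming enabled):

```json
{
  "model": "openai.gpt-oss-120b-1:0",
  "messages": [{ "role": "user", "content": "Hello" }],
  "stream": true
}
```

With `"stream": false`, the same request is expected to succeed.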
What browsers are you seeing the problem on?
No response
Relevant log output
Screenshots
No response
Code of Conduct