Initial Checks
- I confirm that I'm using the latest version of Pydantic AI
- I confirm that I searched for my issue in https://github.com/pydantic/pydantic-ai/issues before opening this issue
Description
When streaming a response from qwen3-coder-plus via DashScope, the run fails with `UnexpectedModelBehavior` raised by this check in `_process_streamed_response`:

if isinstance(first_chunk, _utils.Unset):
    raise UnexpectedModelBehavior(  # pragma: no cover
        'Streamed response ended without content or tool calls'
    )
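
For illustration, here is a minimal sketch of how this check fires when the provider closes the stream before sending any chunk. `Unset` and `PeekableAsyncStream` below are simplified stand-ins for the `pydantic_ai._utils` helpers, not the real implementations:

import asyncio


class Unset:
    """Sentinel: the stream yielded nothing to peek at."""


class PeekableAsyncStream:
    """Simplified stand-in: buffers the first item so it can be inspected early."""

    def __init__(self, stream):
        self._it = stream.__aiter__()
        self._buffer = Unset()

    async def peek(self):
        if isinstance(self._buffer, Unset):
            try:
                self._buffer = await self._it.__anext__()
            except StopAsyncIteration:
                return Unset()  # nothing ever arrived
        return self._buffer


async def empty_stream():
    # Simulates a provider that opens the stream and closes it with zero chunks.
    return
    yield  # unreachable; only makes this function an async generator


async def main():
    first_chunk = await PeekableAsyncStream(empty_stream()).peek()
    if isinstance(first_chunk, Unset):
        # This is the condition that makes pydantic-ai raise UnexpectedModelBehavior.
        print('Streamed response ended without content or tool calls')


asyncio.run(main())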
Example Code
async def _process_streamed_response(
    self, response: AsyncStream[ChatCompletionChunk], model_request_parameters: ModelRequestParameters
) -> OpenAIStreamedResponse:
    """Process a streamed response, and prepare a streaming response to return."""
    peekable_response = _utils.PeekableAsyncStream(response)
    first_chunk = await peekable_response.peek()
    if isinstance(first_chunk, _utils.Unset):
        raise UnexpectedModelBehavior(  # pragma: no cover
            'Streamed response ended without content or tool calls'
        )

    # When using Azure OpenAI and a content filter is enabled, the first chunk will contain a `''` model name,
    # so we set it from a later chunk in `OpenAIChatStreamedResponse`.
    model_name = first_chunk.model or self._model_name

    if not first_chunk.created:
        # patch time
        first_chunk.created = int(time.time())

    return OpenAIStreamedResponse(
        model_request_parameters=model_request_parameters,
        _model_name=model_name,
        _model_profile=self.profile,
        _response=peekable_response,
        _timestamp=number_to_datetime(first_chunk.created),
        _provider_name=self._provider.name,
        _provider_url=self._provider.base_url,
    )
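
As a possible caller-side workaround until empty streams are handled more gracefully, the error can be caught and the run retried. This is only a sketch, assuming the standard `Agent.run_stream` API; the model id and retry policy are illustrative, and the agent would need to be configured for the DashScope OpenAI-compatible endpoint as usual:

import asyncio

from pydantic_ai import Agent
from pydantic_ai.exceptions import UnexpectedModelBehavior

# Illustrative agent; configure the DashScope/OpenAI-compatible provider as you normally would.
agent = Agent('openai:qwen3-coder-plus')


async def run_with_retry(prompt: str, attempts: int = 3) -> str:
    """Retry a streamed run when the provider ends the stream without any chunks."""
    for attempt in range(attempts):
        try:
            async with agent.run_stream(prompt) as result:
                return await result.get_output()
        except UnexpectedModelBehavior:
            # Empty stream from the provider: back off briefly, then retry.
            await asyncio.sleep(2 ** attempt)
    raise RuntimeError(f'stream was still empty after {attempts} attempts')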
Python, Pydantic AI & LLM client version

Python 3.10
pydantic-ai 1.3.0
LLM: DashScope qwen3-coder-plus