Fix streaming thinking tags split across multiple chunks #3206
base: main
Changes from 14 commits
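For context, the bug being fixed is that a thinking tag such as `<think>` can arrive split across streaming chunks (for example `<th` at the end of one chunk and `ink>` at the start of the next), so the parts manager has to buffer a suspected partial tag and flush it at the end of the stream if it never completes. The following is a minimal, self-contained sketch of that buffering idea only; `SplitTagBuffer` and its methods are made up for illustration and are not the library's actual API, and the real code also has to handle the closing `</think>` tag the same way.

THINK_OPEN = '<think>'


class SplitTagBuffer:
    """Toy buffer for a '<think>' tag split across stream chunks (illustration only)."""

    def __init__(self) -> None:
        self._pending = ''     # suffix of the last chunk that might start a tag
        self.thinking = False  # are we inside a thinking block?

    def feed(self, chunk: str) -> list[tuple[str, str]]:
        """Return ('text' | 'thinking', content) pairs for this chunk."""
        data = self._pending + chunk
        self._pending = ''
        events: list[tuple[str, str]] = []

        if self.thinking:
            return [('thinking', data)]

        if THINK_OPEN in data:
            before, after = data.split(THINK_OPEN, 1)
            if before:
                events.append(('text', before))
            self.thinking = True
            if after:
                events.append(('thinking', after))
            return events

        # If the chunk ends with a prefix of '<think>' (e.g. '<th'), hold that
        # suffix back until the next chunk resolves the ambiguity.
        for i in range(len(data)):
            if THINK_OPEN.startswith(data[i:]):
                self._pending = data[i:]
                data = data[:i]
                break
        if data:
            events.append(('text', data))
        return events

    def finalize(self) -> list[tuple[str, str]]:
        """At the end of the stream, flush anything still buffered."""
        if self._pending:
            pending, self._pending = self._pending, ''
            return [('text', pending)]
        return []


# A tag split across two chunks is still recognised as one tag:
buf = SplitTagBuffer()
assert buf.feed('Hello <th') == [('text', 'Hello ')]
assert buf.feed('ink>pondering') == [('thinking', 'pondering')]

The `finalize()` added in this PR plays the role of the flush step here: whatever is still buffered when the provider stops sending chunks is surfaced instead of being silently dropped.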
@@ -521,7 +521,7 @@ class StreamedResponse(ABC):
     _event_iterator: AsyncIterator[ModelResponseStreamEvent] | None = field(default=None, init=False)
     _usage: RequestUsage = field(default_factory=RequestUsage, init=False)

-    def __aiter__(self) -> AsyncIterator[ModelResponseStreamEvent]:
+    def __aiter__(self) -> AsyncIterator[ModelResponseStreamEvent]:  # noqa: C901
         """Stream the response as an async iterable of [`ModelResponseStreamEvent`][pydantic_ai.messages.ModelResponseStreamEvent]s.

         This proxies the `_event_iterator()` and emits all events, while also checking for matches
@@ -580,6 +580,16 @@ def part_end_event(next_part: ModelResponsePart | None = None) -> PartEndEvent | None:
                     yield event

+                # Flush any buffered content and stream finalize events
+                for finalize_event in self._parts_manager.finalize():
+                    if isinstance(finalize_event, PartStartEvent):
+                        if last_start_event:
+                            end_event = part_end_event(finalize_event.part)
+                            if end_event:
+                                yield end_event
+                        last_start_event = finalize_event
+                    yield finalize_event
+
                 end_event = part_end_event()
                 if end_event:
                     yield end_event
Review comment (truncated): Should we set
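The flush loop in the hunk above follows a familiar bookkeeping pattern: whenever a new part starts, first emit an end event for the part that was previously started, and emit one last end event once the stream is exhausted. A stripped-down sketch of that pattern using plain tuples rather than the library's event classes (all names here are illustrative):

from collections.abc import Iterable, Iterator


def with_part_end_events(events: Iterable[tuple[str, str]]) -> Iterator[tuple[str, str]]:
    """Re-emit ('start' | 'delta', part_id) events, closing the previous part
    whenever a new one starts and closing the last part at the end."""
    last_started = None
    for kind, part_id in events:
        if kind == 'start':
            if last_started is not None:
                yield ('end', last_started)  # close the part that was still open
            last_started = part_id
        yield (kind, part_id)
    if last_started is not None:
        yield ('end', last_started)          # close the final open part


stream = [('start', 'p0'), ('delta', 'p0'), ('start', 'p1'), ('delta', 'p1')]
assert list(with_part_end_events(stream)) == [
    ('start', 'p0'), ('delta', 'p0'), ('end', 'p0'),
    ('start', 'p1'), ('delta', 'p1'), ('end', 'p1'),
]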
@@ -602,6 +612,10 @@ async def _get_event_iterator(self) -> AsyncIterator[ModelResponseStreamEvent]:

     def get(self) -> ModelResponse:
         """Build a [`ModelResponse`][pydantic_ai.messages.ModelResponse] from the data received from the stream so far."""
+        # Flush any buffered content before building response
+        for _ in self._parts_manager.finalize():
+            pass
+
         return ModelResponse(
             parts=self._parts_manager.get_parts(),
             model_name=self.model_name,
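The `for _ in ...: pass` loop in `get()` is presumably there because `finalize()` is a generator, so its buffer-flushing side effects only happen while it is being iterated; the loop drains it purely for those side effects. A small standalone illustration of the idiom (the generator here is hypothetical):

from collections import deque


def flush_pending():
    """Hypothetical generator whose side effects happen as it is iterated."""
    print('flushing buffered content')
    yield 'flushed-part'


# Two equivalent ways to run a generator only for its side effects:
for _ in flush_pending():
    pass

deque(flush_pending(), maxlen=0)  # exhausts the iterator without storing items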
Review comment: there are only two cases when finalize will have an effect, when we're buffering:
- `<th` → emits a `PartStartEvent`
- `</th` → emits a `PartDeltaEvent`

Coverage is complaining that there's no test running through the `PartDeltaEvent` branch of this, so I need to figure out how to test it.
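Such a test could drive a streamed response that stops while a partial tag is still buffered. Against the toy `SplitTagBuffer` sketched near the top of this page (not the real parts manager), the two end-of-stream cases look roughly like this; the `PartStartEvent` case corresponds to flushed text that opens a brand-new part, and the `PartDeltaEvent` case to flushed text appended to a part that had already started:

def test_finalize_flushes_partial_tag_at_end_of_stream():
    # Nothing was emitted before the stream ended: the flushed text opens a
    # new part (the PartStartEvent-like case).
    buf = SplitTagBuffer()
    assert buf.feed('<th') == []
    assert buf.finalize() == [('text', '<th')]

    # A text part was already streaming: the flushed text extends it
    # (the PartDeltaEvent-like case).
    buf = SplitTagBuffer()
    assert buf.feed('answer <th') == [('text', 'answer ')]
    assert buf.finalize() == [('text', '<th')]

A real test would instead drive the model's streamed response and assert on the `PartStartEvent` / `PartDeltaEvent` objects it emits.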