
[AI] Propagate Task cancellation when streaming#16123

Draft
andrewheard wants to merge 1 commit into main from ah/ai-streaming-task-cancellation

Conversation

@andrewheard
Contributor

Added `onTermination` handlers to the `Task`s created in `Chat`, `GenerativeModel` and `GenerativeAIService` when streaming. Previously, these unstructured tasks were not cancelled when the outer `AsyncThrowingStream` was cancelled, for example when navigating away from a SwiftUI `View` that initiated a `streamResponse` call via `GenerativeModelSession`, which transitively depends on these types.

#no-changelog
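The fix described above can be sketched as follows. This is a minimal illustration of the pattern, not the SDK's actual code: `streamResponse`'s signature, `makeElements`, and the `String` element type are stand-ins.

```swift
import Foundation

// Sketch of the fix: forward stream termination to the unstructured Task.
// `makeElements` stands in for the SDK's underlying response stream.
func streamResponse(
  makeElements: @escaping @Sendable () -> AsyncThrowingStream<String, Error>
) -> AsyncThrowingStream<String, Error> {
  AsyncThrowingStream { continuation in
    let task = Task {
      do {
        for try await element in makeElements() {
          continuation.yield(element)
        }
        continuation.finish()
      } catch {
        continuation.finish(throwing: error)
      }
    }
    // Without this handler, cancelling the outer stream (e.g. when the
    // SwiftUI view that started it disappears) leaves `task` running.
    continuation.onTermination = { _ in
      task.cancel()
    }
  }
}
```

When the consumer stops iterating, the stream's termination handler fires and the inner task is cancelled, instead of silently running to completion in the background.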

@gemini-code-assist
Contributor

Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its output is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

@andrewheard andrewheard added this to the 12.13.0 - M180 milestone Apr 27, 2026
@gemini-code-assist
Contributor

Summary of Changes

This pull request improves resource management by ensuring that asynchronous tasks created during streaming operations are correctly cancelled when the stream is terminated. By attaching onTermination handlers to the stream continuations, the system now properly propagates cancellation signals, which is particularly important for preventing memory leaks or redundant processing when UI components, such as SwiftUI views, are dismissed.

Highlights

  • Task Cancellation Propagation: Implemented onTermination handlers for AsyncThrowingStream continuations in Chat, GenerativeModel, and GenerativeAIService.
  • Resource Management: Ensured that unstructured Tasks are properly cancelled when their associated streams are terminated, preventing unnecessary background work.


Activity
  • The pull request was created by andrewheard.
  • The gemini-code-assist bot provided documentation on how to use its features.
  • The author requested a summary of the pull request using the /gemini summary command.

@andrewheard
Contributor Author

/gemini review

Contributor

@gemini-code-assist Bot left a comment


Code Review

This pull request implements task cancellation for AsyncThrowingStream across the Chat, GenerativeAIService, and GenerativeModel classes by capturing the Task and invoking cancel() on termination. The review feedback suggests adding explicit cancellation checks before updating the chat history to prevent inconsistent states and before finishing the continuation to avoid emitting unnecessary errors when a task is cancelled.

let stream = try model.generateContentStream(request, generationConfig: generationConfig)
return AsyncThrowingStream { continuation in
-  Task {
+  let task = Task {
Contributor


medium

The task should check for cancellation before modifying the chat history. If the stream is cancelled, appending newContent and the partial aggregatedContent to _history (lines 138-142) might lead to an inconsistent chat state. Consider adding a check like guard !Task.isCancelled else { return } before updating the history.
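The suggested guard might look like the following sketch. `ChatSketch` and `recordTurn` are simplified stand-ins for the SDK's `Chat` type, reusing the `_history`, `newContent`, and `aggregatedContent` names from the comment above:

```swift
// Simplified stand-in for Chat's history update after streaming ends.
final class ChatSketch {
  private var _history: [String] = []
  var history: [String] { _history }

  func recordTurn(newContent: String, aggregatedContent: String) {
    // If the streaming task was cancelled, skip the update so a
    // partial aggregated response never lands in the chat history.
    guard !Task.isCancelled else { return }
    _history.append(newContent)
    _history.append(aggregatedContent)
  }
}
```

`Task.isCancelled` reads the cancellation flag of whichever task the code is running in, so placing the guard inside the streaming task's closure covers the cancellation path added by this PR.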

return AsyncThrowingStream { continuation in
  let responseStream = generativeAIService.loadRequestStream(request: generateContentRequest)
-  Task {
+  let task = Task {
Contributor


low

Consider checking for task cancellation before finishing the continuation. If the task is cancelled during the loop, the code currently proceeds to line 357 and might call continuation.finish(throwing:) with an emptyContent error if no responses were yielded yet. It would be cleaner to exit early if Task.isCancelled is true.
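An early exit along those lines could be sketched as below. `ResponseError.emptyContent` and `finishStream` are illustrative stand-ins, not the SDK's actual declarations:

```swift
// Illustrative stand-in for the error thrown when no responses arrived.
enum ResponseError: Error { case emptyContent }

func finishStream(
  sawResponse: Bool,
  continuation: AsyncThrowingStream<String, Error>.Continuation
) {
  // If the task was cancelled mid-loop, finish quietly rather than
  // reporting emptyContent for a stream the caller already abandoned.
  if Task.isCancelled {
    continuation.finish()
    return
  }
  guard sawResponse else {
    continuation.finish(throwing: ResponseError.emptyContent)
    return
  }
  continuation.finish()
}
```

Finishing without an error on cancellation matches the consumer's expectation: a caller that tore down the stream has no use for an emptyContent failure.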

