[AI] Propagate Task cancellation when streaming #16123
Conversation
Summary of Changes

This pull request improves resource management by ensuring that asynchronous tasks created during streaming operations are correctly cancelled when the stream is terminated. By attaching `onTermination` handlers to the continuations of the `AsyncThrowingStream`s, cancellation of the stream now propagates to the unstructured `Task`s that back it.
/gemini review
Code Review
This pull request implements task cancellation for AsyncThrowingStream across the Chat, GenerativeAIService, and GenerativeModel classes by capturing the Task and invoking cancel() on termination. The review feedback suggests adding explicit cancellation checks before updating the chat history to prevent inconsistent states and before finishing the continuation to avoid emitting unnecessary errors when a task is cancelled.
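The core pattern the pull request applies can be sketched as follows. This is an illustrative, minimal example rather than the actual SDK code: the unstructured `Task` created inside the stream's builder closure is captured, and the continuation's `onTermination` handler cancels it when the consumer stops iterating.

```swift
import Foundation

// Minimal sketch of the cancellation-propagation pattern (names and the
// produced values are illustrative, not the SDK implementation).
func numberStream() -> AsyncThrowingStream<Int, Error> {
  AsyncThrowingStream { continuation in
    // Capture the Task so it can be cancelled from onTermination.
    let task = Task {
      for i in 0 ..< 100 {
        try Task.checkCancellation()
        continuation.yield(i)
        try await Task.sleep(nanoseconds: 10_000_000)
      }
      continuation.finish()
    }
    // Without this handler, cancelling the stream's consumer (e.g. a
    // SwiftUI .task being torn down) would leave the inner Task running.
    continuation.onTermination = { _ in
      task.cancel()
    }
  }
}
```

Before this change, the `Task` was created anonymously (`Task { ... }`), so there was no handle through which termination of the stream could cancel it.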
```diff
  let stream = try model.generateContentStream(request, generationConfig: generationConfig)
  return AsyncThrowingStream { continuation in
-   Task {
+   let task = Task {
```
The task should check for cancellation before modifying the chat history. If the stream is cancelled, appending `newContent` and the partial `aggregatedContent` to `_history` (lines 138-142) might leave the chat in an inconsistent state. Consider adding a check like `guard !Task.isCancelled else { return }` before updating the history.
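The suggested guard could be applied roughly as follows. This is a hypothetical sketch based on the review comment; the history-related names (`_history`, `newContent`, `aggregatedContent`, `ModelContent`) are assumptions about the surrounding code, not verified SDK internals.

```swift
let task = Task {
  var aggregatedContent: [ModelContent] = []
  do {
    for try await response in stream {
      // ... yield the response and accumulate its parts into
      // aggregatedContent ...
      continuation.yield(response)
    }
  } catch {
    continuation.finish(throwing: error)
    return
  }
  // If the stream was cancelled mid-flight, skip the history mutation
  // entirely so a partial model turn is never recorded.
  guard !Task.isCancelled else { return }
  _history.append(newContent)
  _history.append(contentsOf: aggregatedContent)
  continuation.finish()
}
```

The guard sits after the loop, so a cancellation that arrives while responses are still arriving prevents both the user turn and the partial model turn from being appended.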
```diff
  return AsyncThrowingStream { continuation in
    let responseStream = generativeAIService.loadRequestStream(request: generateContentRequest)
-   Task {
+   let task = Task {
```
Consider checking for task cancellation before finishing the continuation. If the task is cancelled during the loop, the code currently proceeds to line 357 and might call `continuation.finish(throwing:)` with an `emptyContent` error even though no responses were yielded only because the consumer went away. It would be cleaner to exit early if `Task.isCancelled` is true.
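One way the early exit could look is sketched below. The error construction (`GenerateContentError`, `EmptyContentError`) is assumed for illustration; the point is the cancellation check between the loop and the final `finish(throwing:)`.

```swift
let task = Task {
  var yieldedResponse = false
  do {
    for try await response in responseStream {
      yieldedResponse = true
      continuation.yield(response)
    }
  } catch {
    continuation.finish(throwing: error)
    return
  }
  // A cancelled task ending with no responses is expected, not an error:
  // finish the stream quietly instead of surfacing emptyContent.
  guard !Task.isCancelled else {
    continuation.finish()
    return
  }
  if yieldedResponse {
    continuation.finish()
  } else {
    continuation.finish(throwing: GenerateContentError.internalError(
      underlying: EmptyContentError()
    ))
  }
}
```

This keeps the `emptyContent` diagnostic for genuinely empty server responses while treating consumer-driven cancellation as a normal termination.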
Added `onTermination` handlers to the `Task`s created in `Chat`, `GenerativeModel` and `GenerativeAIService` when streaming. Previously these unstructured tasks would not be cancelled when the outer `AsyncThrowingStream` was cancelled, for example when navigating away from a SwiftUI View that initiated a `streamResponse` using `GenerativeModelSession`, which transitively depends on these types.

#no-changelog
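The SwiftUI scenario described above can be sketched like this. The view and the `streamResponse` call shape are illustrative assumptions: when the view disappears, `.task` cancels its task, which cancels the `for try await` iteration, which (with this PR) now propagates to the streaming `Task` inside the SDK.

```swift
import SwiftUI

// Illustrative view; `session` and `streamResponse` are assumed shapes
// based on the PR description, not verified API signatures.
struct ResponseView: View {
  @State private var text = ""
  let session: GenerativeModelSession

  var body: some View {
    Text(text)
      // .task is cancelled automatically when the view disappears;
      // that cancellation now reaches the underlying streaming Task
      // via the stream's onTermination handler.
      .task {
        do {
          for try await chunk in session.streamResponse("Hello") {
            text += chunk.text ?? ""
          }
        } catch {
          // Cancellation or a transport error ends the stream here.
        }
      }
  }
}
```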