
Commit 68dbde5

[Bugfix] remove duplicate tokens streamed in required tool choice streaming (vllm-project#23312)
Signed-off-by: Jason Cheng <[email protected]> Co-authored-by: Chauncey <[email protected]>
Parent: 04ad0dc

File tree: 1 file changed, 0 additions (+), 3 deletions (−)


vllm/entrypoints/openai/serving_chat.py

Lines changed: 0 additions & 3 deletions
@@ -828,9 +828,6 @@ async def chat_completion_stream_generator(
                         history_tool_call_cnt += 1
                         tools_streamed[i] = True
 
-                    # update the previous values for the next iteration
-                    previous_texts[i] = current_text
-
                 # handle streaming deltas for tools with "auto" tool choice
                 # and reasoning parser
                 elif tool_choice_auto and self.reasoning_parser:
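
The deleted lines advanced previous_texts[i] at the end of the required tool-choice branch; per the commit title, that extra bookkeeping led to duplicate tokens in the stream. The sketch below illustrates the general hazard only. It is not vLLM's actual serving_chat.py logic: run_stream, the chunk list, and the placement of the end-of-iteration update are hypothetical, and the point is simply that advancing previous_texts[i] twice in one iteration can make a delta extractor resend text the client has already received.

# Hypothetical sketch, not vLLM code: shows how a redundant branch-local
# update of previous_texts[i] duplicates streamed tokens when the loop
# also maintains that state at the end of each iteration.

def run_stream(chunks: list[str], branch_local_update: bool) -> list[str]:
    previous_texts = [""]   # accumulated text per choice (one choice here)
    streamed_len = 0        # how much of current_text the client has seen
    emitted: list[str] = []
    i = 0

    for delta_text in chunks:
        # Rebuild the full generation so far for this choice.
        current_text = previous_texts[i] + delta_text

        # Stream only the part the client has not received yet.
        emitted.append(current_text[streamed_len:])
        streamed_len = len(current_text)

        if branch_local_update:
            # Analogue of the deleted `previous_texts[i] = current_text`
            # line (hypothetical placement): the shared bookkeeping below
            # will advance previous_texts[i] a second time.
            previous_texts[i] = current_text

        # Shared end-of-iteration bookkeeping (hypothetical placement).
        previous_texts[i] = previous_texts[i] + delta_text

    return emitted


chunks = ['{"name": "get_weather", ', '"arguments": {"city": ', '"Paris"}}']
print("".join(run_stream(chunks, branch_local_update=False)))
# -> {"name": "get_weather", "arguments": {"city": "Paris"}}
print("".join(run_stream(chunks, branch_local_update=True)))
# -> earlier chunks reappear in later deltas, i.e. duplicated tokens

In the sketch, dropping the branch-local assignment (the analogue of the deleted lines) restores the correct output, since previous_texts[i] then advances exactly once per iteration.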
