Commit b337647

[Bugfix] Drop empty tool_calls lists to keep assistant replies in chat template (vllm-project#30648)
Signed-off-by: Seokhyun An <[email protected]>
1 parent a524d1b commit b337647

File tree

1 file changed (+11, -6 lines changed)

vllm/entrypoints/chat_utils.py

Lines changed: 11 additions & 6 deletions
@@ -1629,12 +1629,17 @@ def _postprocess_messages(messages: list[ConversationMessage]) -> None:
     # so, for messages that have tool_calls, parse the string (which we get
     # from openAI format) to dict
     for message in messages:
-        if (
-            message["role"] == "assistant"
-            and "tool_calls" in message
-            and isinstance(message["tool_calls"], list)
-        ):
-            for item in message["tool_calls"]:
+        if message["role"] == "assistant" and "tool_calls" in message:
+            tool_calls = message.get("tool_calls")
+            if not isinstance(tool_calls, list):
+                continue
+
+            if len(tool_calls) == 0:
+                # Drop empty tool_calls to keep templates on the normal assistant path.
+                message.pop("tool_calls", None)
+                continue
+
+            for item in tool_calls:
                 # if arguments is None or empty string, set to {}
                 if content := item["function"].get("arguments"):
                     if not isinstance(content, (dict, list)):
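
A minimal standalone sketch of the patched behavior, using plain dicts instead of vLLM's ConversationMessage type. The `_postprocess_messages` name and the argument-parsing tail (which the diff truncates) are assumptions for illustration; the point shown is that an empty tool_calls list is dropped so the message renders as a plain assistant reply.

import json


def _postprocess_messages(messages: list[dict]) -> None:
    # Sketch of the patched logic with plain dicts (assumption: not vLLM's
    # actual module; mirrors the diff above for illustration).
    for message in messages:
        if message["role"] == "assistant" and "tool_calls" in message:
            tool_calls = message.get("tool_calls")
            if not isinstance(tool_calls, list):
                continue

            if len(tool_calls) == 0:
                # Drop empty tool_calls so templates take the normal assistant path.
                message.pop("tool_calls", None)
                continue

            for item in tool_calls:
                # The diff is truncated here; this tail assumes the existing
                # behavior of parsing JSON-string arguments into a dict and
                # defaulting missing or empty arguments to {}.
                arguments = item["function"].get("arguments")
                if arguments and not isinstance(arguments, (dict, list)):
                    item["function"]["arguments"] = json.loads(arguments)
                elif not arguments:
                    item["function"]["arguments"] = {}


# Example: an assistant reply with an empty tool_calls list keeps its content
# and loses the empty list, so chat templates render it as a plain reply.
conversation = [
    {"role": "user", "content": "What's the weather in Paris?"},
    {"role": "assistant", "content": "Let me check.", "tool_calls": []},
]
_postprocess_messages(conversation)
assert "tool_calls" not in conversation[1]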
