
Commit 505c019

fix: Remove incorrect newline at the end of granite chat template gen prompt
There should not be one, even for the language models.

Branch: GraniteDoclingStopping
Signed-off-by: Gabe Goodhart <[email protected]>
1 parent: 2de4d2a · commit: 505c019


src/llama-chat.cpp

Lines changed: 1 addition & 1 deletion
@@ -590,7 +590,7 @@ int32_t llm_chat_apply_template(
             ss << message->content << "<|end_of_text|>\n";
         }
         if (add_ass) {
-            ss << "<|start_of_role|>assistant<|end_of_role|>\n";
+            ss << "<|start_of_role|>assistant<|end_of_role|>";
         }
     } else if (tmpl == LLM_CHAT_TEMPLATE_GIGACHAT) {
         // GigaChat template
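
For context, here is a minimal standalone sketch of how the Granite branch of llm_chat_apply_template assembles the prompt. The chat_msg struct and the example conversation are hypothetical simplifications, not the llama.cpp API; the sketch only illustrates that with add_ass set, the generation prompt now ends immediately after <|end_of_role|>, with no trailing newline.

#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// Minimal sketch, assuming a simplified message struct; the real code in
// src/llama-chat.cpp builds the prompt from llama_chat_message entries and
// handles additional roles such as tool calls.
struct chat_msg {
    std::string role;
    std::string content;
};

int main() {
    // Hypothetical single-turn conversation.
    std::vector<chat_msg> chat = { { "user", "Hello!" } };
    bool add_ass = true;  // caller requested the assistant generation prompt

    std::ostringstream ss;
    for (const auto & m : chat) {
        ss << "<|start_of_role|>" << m.role << "<|end_of_role|>"
           << m.content << "<|end_of_text|>\n";
    }
    if (add_ass) {
        // After this commit the generation prompt has no trailing '\n',
        // so generation continues directly after <|end_of_role|>.
        ss << "<|start_of_role|>assistant<|end_of_role|>";
    }
    std::cout << ss.str();
    return 0;
}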
