
Commit c40b52e

llama-chat : fix multiple system message for gemma, orion
1 parent 26ff368 commit c40b52e

File tree

1 file changed, 2 insertions(+), 2 deletions(-)


src/llama-chat.cpp

Lines changed: 2 additions & 2 deletions

@@ -331,7 +331,7 @@ int32_t llm_chat_apply_template(
         std::string role(message->role);
         if (role == "system") {
             // there is no system message for gemma, but we will merge it with user prompt, so nothing is broken
-            system_prompt = trim(message->content);
+            system_prompt += trim(message->content);
             continue;
         }
         // in gemma, "assistant" is "model"
@@ -353,7 +353,7 @@ int32_t llm_chat_apply_template(
         std::string role(message->role);
         if (role == "system") {
             // there is no system message support, we will merge it with user prompt
-            system_prompt = message->content;
+            system_prompt += message->content;
             continue;
         } else if (role == "user") {
             ss << "Human: ";
