Commit 4907499

Fixes chat template application to choices (#67)
We don't need to apply tokenizer.apply_chat_template to the possible choices: the correct logic is already handled in get_examples_with_chat_template, which adds the generation-prompt start before the choice tokens.
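A minimal sketch of why the removed code was wrong, using a toy template function (a stand-in for the real HF tokenizer, not lighteval's actual implementation): the context built with a generation prompt already ends at the assistant turn, so wrapping each choice in its own chat template duplicates the assistant role markers.

```python
def apply_chat_template(messages, add_generation_prompt=False):
    # Toy stand-in for tokenizer.apply_chat_template, returning a string.
    out = "".join(f"<|{m['role']}|>{m['content']}<|end|>" for m in messages)
    if add_generation_prompt:
        out += "<|assistant|>"
    return out

# The few-shot context already ends with the assistant generation prompt.
ctx = apply_chat_template(
    [{"role": "user", "content": "2+2?"}], add_generation_prompt=True
)
choice = "4"

# Correct: append the raw choice directly after the generation prompt.
correct = ctx + choice

# Buggy (what the removed code did): templating the choice again
# duplicates the assistant role marker.
buggy = ctx + apply_chat_template([{"role": "assistant", "content": choice}])

assert correct.endswith("<|assistant|>4")
assert "<|assistant|><|assistant|>" in buggy  # role marker duplicated
```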
Parent: 449817f

File tree: 1 file changed, 0 insertions(+), 6 deletions(-)

src/lighteval/tasks/lighteval_task.py (0 additions, 6 deletions)

@@ -602,12 +602,6 @@ def create_requests_from_tasks(  # noqa: C901
             doc.num_effective_few_shots = num_effective_few_shots
             doc.num_asked_few_shots = num_fewshot
             doc.ctx = ctx
-            if use_chat_template:
-                doc.choices = [
-                    lm.tokenizer.apply_chat_template([{"role": "assistant", "content": choice}])
-                    for choice in doc.choices
-                ]
-
             # Constructing the requests
             docs[TaskExampleId(cur_task_name, doc_id_seed)] = doc
             reqs = task.construct_requests(doc, ctx, doc_id_seed, cur_task_name)
