Commit 0171884 (1 parent: 9379cbd)

fix inference rebatching bug

File tree: 1 file changed (+1, −1)
  • applications/ColossalChat/coati/experience_maker
applications/ColossalChat/coati/experience_maker/naive.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -140,7 +140,7 @@ def make_experience(
         num_actions = 0

         for inference_mini_batch_id in range(0, input_ids.size(0), self.inference_batch_size):
-            s, e = inference_mini_batch_id, (inference_mini_batch_id + 1) * self.inference_batch_size
+            s, e = inference_mini_batch_id, inference_mini_batch_id + self.inference_batch_size
             if input_ids[s:e].size(0) == 0:
                 break
             sequences = generate(self.actor, input_ids[s:e], self.tokenizer, **generate_kwargs)
```
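The fix matters because `range(0, input_ids.size(0), self.inference_batch_size)` already steps the loop variable by the batch size, so the correct slice end is simply `start + batch_size`. The old expression `(start + 1) * batch_size` grew multiplicatively, producing an oversized second mini-batch and re-processing trailing items. A minimal standalone sketch (hypothetical names, plain lists instead of tensors, not the actual ColossalChat code) reproduces the effect:

```python
# Hypothetical reproduction of the rebatching bug with plain lists.
inference_batch_size = 4
input_ids = list(range(10))  # stand-in for a batch of 10 sequences

buggy, fixed = [], []
for start in range(0, len(input_ids), inference_batch_size):
    # Old (buggy) bound: grows multiplicatively with the loop variable,
    # so later slices overshoot and overlap earlier ones.
    buggy.append(input_ids[start:(start + 1) * inference_batch_size])
    # Fixed bound: each mini-batch is exactly the next batch_size items.
    fixed.append(input_ids[start:start + inference_batch_size])

# buggy → [[0, 1, 2, 3], [4, 5, 6, 7, 8, 9], [8, 9]]
#   (second "mini-batch" holds 6 items; 8 and 9 are generated twice)
# fixed → [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
#   (even mini-batches of at most inference_batch_size items)
```

With the fix, every mini-batch fed to `generate` has at most `inference_batch_size` sequences and no sequence is generated more than once.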
