This repository was archived by the owner on Sep 10, 2025. It is now read-only.

Commit 285860e: add explanatory comment on topk min check
1 parent 3b550f1 · commit 285860e

File tree

1 file changed (+1 −1 lines changed)

dist_run.py

Lines changed: 1 addition & 1 deletion
@@ -219,7 +219,7 @@ def _batch_decode_next_tokens(
 
     # Uses top-k sampling if temperature is not 1.0, otherwise use argmax
     if temperature != 1.0:
-        top_k = min(topk, vocab_size)
+        top_k = min(topk, vocab_size)  # Ensure top-k is not greater than vocab size
         top_k_logits, top_k_indices = torch.topk(next_token_logits, k=top_k, dim=-1)
         probs = torch.softmax(top_k_logits, dim=-1)
         next_token_indices = torch.multinomial(probs, num_samples=1).squeeze(-1)
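For context, below is a minimal self-contained sketch of this top-k sampling path. The standalone function name sample_top_k and its signature are assumptions for illustration, not the repository's API, and the mapping from top-k positions back to vocabulary token ids (presumably handled elsewhere in the full _batch_decode_next_tokens) is folded in here so the sketch runs on its own. It shows why the min() clamp matters: torch.topk raises a RuntimeError when k exceeds the size of the sampled dimension.

import torch

def sample_top_k(next_token_logits: torch.Tensor, topk: int, temperature: float) -> torch.Tensor:
    # Hypothetical standalone version of the sampling step shown in the diff above.
    vocab_size = next_token_logits.size(-1)
    # Uses top-k sampling if temperature is not 1.0, otherwise use argmax
    if temperature != 1.0:
        top_k = min(topk, vocab_size)  # Ensure top-k is not greater than vocab size
        top_k_logits, top_k_indices = torch.topk(next_token_logits, k=top_k, dim=-1)
        probs = torch.softmax(top_k_logits, dim=-1)
        next_token_indices = torch.multinomial(probs, num_samples=1).squeeze(-1)
        # Map positions within the top-k slice back to vocabulary token ids
        # (added here for self-containment; not part of the diff above).
        return top_k_indices.gather(-1, next_token_indices.unsqueeze(-1)).squeeze(-1)
    return torch.argmax(next_token_logits, dim=-1)

# Without the min() clamp, topk=10 on a 5-token "vocabulary" would raise a RuntimeError.
logits = torch.randn(2, 5)
print(sample_top_k(logits, topk=10, temperature=0.8))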

0 commit comments
