Commit 5161aaf

Reduce micro_batch_size to 2 to avoid OOM during backward pass

1 parent 660ffd8 commit 5161aaf

1 file changed, 1 addition(+), 1 deletion(-)

scripts/train_grpo.py (1 addition, 1 deletion)

@@ -58,7 +58,7 @@ class TrainingConfig:
     num_train_epochs: int = 1
     rollouts_per_example: int = 16
     batch_size: int = 32
-    micro_batch_size: int = 8
+    micro_batch_size: int = 2  # Keep small to avoid OOM during backward pass
     learning_rate: float = 1e-6
     max_seq_len: int = 1024  # Reduced - poems are small
     max_prompt_len: int = 384
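A minimal sketch of why shrinking micro_batch_size avoids OOM without changing the effective batch, assuming the training loop uses gradient accumulation (the config fields below come from the diff; the `grad_accum_steps` helper is hypothetical, for illustration only):

```python
from dataclasses import dataclass


@dataclass
class TrainingConfig:
    # Fields as they appear after this commit.
    num_train_epochs: int = 1
    rollouts_per_example: int = 16
    batch_size: int = 32
    micro_batch_size: int = 2  # Keep small to avoid OOM during backward pass
    learning_rate: float = 1e-6
    max_seq_len: int = 1024
    max_prompt_len: int = 384


def grad_accum_steps(cfg: TrainingConfig) -> int:
    # Hypothetical helper: each optimizer step accumulates gradients over
    # batch_size / micro_batch_size backward passes, so activation memory
    # scales with micro_batch_size while the effective batch stays batch_size.
    assert cfg.batch_size % cfg.micro_batch_size == 0
    return cfg.batch_size // cfg.micro_batch_size


print(grad_accum_steps(TrainingConfig()))  # 16 micro-steps per optimizer step
```

With micro_batch_size dropped from 8 to 2, each optimizer step runs 16 backward passes instead of 4, trading throughput for peak memory.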
