Commit 6a2a99f

fix: correct iters_per_epoch calculation in ResNet-finetune example
- Removed unnecessary `grads_accumulation` factor from `iters_per_epoch` computation.
Parent: d151eaa

File tree

1 file changed: +1 −1
  • examples/resnet_finetune/src


examples/resnet_finetune/src/main.rs

Lines changed: 1 addition & 1 deletion
@@ -276,7 +276,7 @@ pub fn train<B: AutodiffBackend>(args: &Args) -> anyhow::Result<()> {
         .num_workers(args.num_workers)
         .build(valid);
 
-    let iters_per_epoch = train_set_size / (args.batch_size * args.grads_accumulation);
+    let iters_per_epoch = train_set_size / args.batch_size;
     let lr_scheduler = CosineAnnealingLrSchedulerConfig::new(
         args.learning_rate,
         iters_per_epoch * args.num_epochs,
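The rationale behind the fix: the dataloader yields `train_set_size / batch_size` batches per epoch regardless of gradient accumulation, so dividing by `grads_accumulation` undercounted the iterations and made the cosine schedule finish its annealing too early. A minimal sketch of the corrected arithmetic (the concrete numbers here are hypothetical, standing in for the values carried by `Args`):

```rust
fn main() {
    // Hypothetical values for illustration; in the example these
    // come from `args.batch_size`, `args.num_epochs`, etc.
    let train_set_size: usize = 50_000;
    let batch_size: usize = 32;
    let num_epochs: usize = 10;

    // One iteration per dataloader batch; accumulating gradients
    // across batches does not reduce how many batches are drawn.
    let iters_per_epoch = train_set_size / batch_size;

    // Total steps handed to the cosine annealing scheduler so that
    // the learning rate reaches its minimum at the end of training.
    let total_steps = iters_per_epoch * num_epochs;

    println!("{iters_per_epoch} {total_steps}"); // prints "1562 15620"
}
```

With the old formula and, say, `grads_accumulation = 4`, the scheduler would have been configured for a quarter of the actual iterations and flat-lined at its minimum learning rate for the remaining three quarters of training.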