Commit c0c3742

ananthsub authored and tchaton committed
Check max_time when setting defaults for min/max epochs (#9072)
Co-authored-by: tchaton <[email protected]>
1 parent 5dffe74 commit c0c3742

File tree

3 files changed: +11, -3 lines


CHANGELOG.md

Lines changed: 4 additions & 1 deletion

@@ -5,12 +5,15 @@ All notable changes to this project will be documented in this file.
 
 The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 
+## [1.4.5] - 2021-08-31
+
+- Fixed not setting a default value for `max_epochs` if `max_time` was specified on the `Trainer` constructor ([#9072](https://github.com/PyTorchLightning/pytorch-lightning/pull/9072))
+
 ## [1.4.4] - 2021-08-24
 
 - Fixed a bug in the binary search mode of auto batch size scaling where exception was raised if the first trainer run resulted in OOM ([#8954](https://github.com/PyTorchLightning/pytorch-lightning/pull/8954))
 - Fixed a bug causing logging with `log_gpu_memory='min_max'` not working ([#9013](https://github.com/PyTorchLightning/pytorch-lightning/pull/9013))
 
-
 ## [1.4.3] - 2021-08-17
 
 - Fixed plateau scheduler stepping on incomplete epoch ([#8861](https://github.com/PyTorchLightning/pytorch-lightning/pull/8861))

pytorch_lightning/trainer/trainer.py

Lines changed: 2 additions & 2 deletions

@@ -380,8 +380,8 @@ def __init__(
         self.tuner = Tuner(self)
 
         fit_loop = FitLoop(
-            min_epochs=(1 if (min_epochs is None and min_steps is None) else min_epochs),
-            max_epochs=(1000 if (max_epochs is None and max_steps is None) else max_epochs),
+            min_epochs=(1 if (min_epochs is None and min_steps is None and max_time is None) else min_epochs),
+            max_epochs=(1000 if (max_epochs is None and max_steps is None and max_time is None) else max_epochs),
         )
         training_epoch_loop = TrainingEpochLoop(min_steps, max_steps)
         training_batch_loop = TrainingBatchLoop()

tests/callbacks/test_timer.py

Lines changed: 5 additions & 0 deletions

@@ -42,6 +42,11 @@ def on_fit_start(self):
     trainer.fit(TestModel())
     assert "callbacks list already contains a Timer" in caplog.text
 
+    seconds = 1
+    trainer = Trainer(max_time=dict(seconds=seconds))
+    assert trainer.max_epochs is None
+    assert trainer.max_steps is None
+
 
 @pytest.mark.parametrize(
     "duration,expected",
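The test passes `max_time` as a keyword dictionary, `dict(seconds=1)`. A hedged sketch of how such a dict can be normalized into a `datetime.timedelta` (a hypothetical helper for illustration, not the Lightning implementation) looks like this:

```python
from datetime import timedelta


def normalize_max_time(max_time):
    # Hypothetical normalization helper: accept either an existing
    # timedelta or a dict of timedelta keyword arguments such as
    # dict(seconds=1) or dict(hours=2, minutes=30).
    if isinstance(max_time, timedelta):
        return max_time
    if isinstance(max_time, dict):
        return timedelta(**max_time)
    raise TypeError(f"unsupported max_time value: {max_time!r}")
```

Passing the dict's keys straight to `timedelta(**max_time)` keeps the accepted key names (`seconds`, `minutes`, `hours`, `days`, ...) in sync with the standard library.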

0 commit comments
