Commit 495aa44

awaelchli authored and lexierule committed
1.4.9 release commit
1 parent eb49d2c · commit 495aa44

4 files changed: 5 additions & 7 deletions


CHANGELOG.md

Lines changed: 3 additions & 2 deletions
@@ -5,13 +5,14 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 
-## [unreleased] - 2021-??-??
+## [1.4.9] - 2021-09-30
 
 - Moved the gradient unscaling in `NativeMixedPrecisionPlugin` from `pre_optimizer_step` to `post_backward` ([#9606](https://github.com/PyTorchLightning/pytorch-lightning/pull/9606))
 - Fixed gradient unscaling being called too late, causing gradient clipping and gradient norm tracking to be applied incorrectly ([#9606](https://github.com/PyTorchLightning/pytorch-lightning/pull/9606))
 - Fixed `lr_find` to generate same results on multiple calls ([#9704](https://github.com/PyTorchLightning/pytorch-lightning/pull/9704))
 - Fixed `reset` metrics on validation epoch end ([#9717](https://github.com/PyTorchLightning/pytorch-lightning/pull/9717))
-
+- Fixed input validation for `gradient_clip_val`, `gradient_clip_algorithm`, `track_grad_norm` and `terminate_on_nan` Trainer arguments ([#9595](https://github.com/PyTorchLightning/pytorch-lightning/pull/9595))
+- Reset metrics before each task starts ([#9410](https://github.com/PyTorchLightning/pytorch-lightning/pull/9410))
 
 
 ## [1.4.8] - 2021-09-22
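
The input-validation entry above refers to the gradient-related Trainer arguments. A minimal usage sketch, not part of this commit (values and the `model` placeholder are illustrative):

    from pytorch_lightning import Trainer

    # The four Trainer arguments whose input validation was tightened in #9595.
    trainer = Trainer(
        max_epochs=3,
        gradient_clip_val=0.5,           # clip gradients during optimization
        gradient_clip_algorithm="norm",  # clip by norm ("value" is the alternative)
        track_grad_norm=2,               # log the 2-norm of the gradients
        terminate_on_nan=True,           # stop if the loss or weights become NaN
    )
    # trainer.fit(model)  # `model` is any LightningModule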

pytorch_lightning/__about__.py

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 import time
 
 _this_year = time.strftime("%Y")
-__version__ = "1.4.8"
+__version__ = "1.4.9"
 __author__ = "William Falcon et al."
 __author_email__ = "[email protected]"
 __license__ = "Apache-2.0"
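
Since the top-level package re-exports this version string, the installed release can be checked at runtime; a small sketch, not part of the commit:

    import pytorch_lightning as pl

    # __version__ is defined in pytorch_lightning/__about__.py.
    print(pl.__version__)  # expected to print "1.4.9" for this release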

pytorch_lightning/trainer/trainer.py

Lines changed: 1 addition & 1 deletion
@@ -1112,7 +1112,7 @@ def _run_sanity_check(self, ref_model):
         self.logger_connector.reset_results()
         self.logger_connector.reset_metrics()
 
-        self.call_hook("on_sanity_check_start")
+        self.on_sanity_check_start()
 
         # reload dataloaders
         self._evaluation_loop.reload_evaluation_dataloaders()
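
The `on_sanity_check_start` hook invoked here is also exposed to user code through callbacks; a minimal illustrative sketch, not part of the commit:

    from pytorch_lightning import Callback, Trainer

    class SanityCheckNotifier(Callback):
        # Illustrative callback: runs when the pre-training validation
        # sanity check begins.
        def on_sanity_check_start(self, trainer, pl_module):
            print("sanity check starting")

    # trainer = Trainer(num_sanity_val_steps=2, callbacks=[SanityCheckNotifier()])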

tests/checkpointing/test_trainer_checkpoint.py

Lines changed: 0 additions & 3 deletions
@@ -75,9 +75,6 @@ def validation_step(self, batch, batch_idx):
         results.append(deepcopy(trainer.callback_metrics))
         best_model_paths.append(trainer.checkpoint_callback.best_model_path)
 
-    for idx in range(len(results) - 1):
-        assert results[idx]["val_loss"] > results[idx + 1]["val_loss"]
-
     for idx, best_model_path in enumerate(best_model_paths):
         if idx == 0:
             assert best_model_path.endswith(f"epoch=0{idx}.ckpt")

0 commit comments
