Commit 4b007e6

sohamtiwari3120, awaelchli, and carmocca authored and committed
[bugfix] Changed CometLogger to stop modifying metrics in place (#9150)
Co-authored-by: Adrian Wälchli <[email protected]>
Co-authored-by: Carlos Mocholí <[email protected]>
1 parent ceb8bdf · commit 4b007e6

File tree: 3 files changed, +18 −5 lines changed

- CHANGELOG.md
- pytorch_lightning/loggers/comet.py
- tests/loggers/test_comet.py

CHANGELOG.md

Lines changed: 3 additions & 2 deletions
@@ -8,15 +8,16 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 ## [1.4.5] - 2021-08-31

 - Fixed reduction using `self.log(sync_dict=True, reduce_fx={mean,max})` ([#9142](https://github.com/PyTorchLightning/pytorch-lightning/pull/9142))
-
-
 - Fixed not setting a default value for `max_epochs` if `max_time` was specified on the `Trainer` constructor ([#9072](https://github.com/PyTorchLightning/pytorch-lightning/pull/9072))
+- Fixed the CometLogger, no longer modifies the metrics in place. Instead creates a copy of metrics before performing any operations ([#9150](https://github.com/PyTorchLightning/pytorch-lightning/pull/9150))
+

 ## [1.4.4] - 2021-08-24

 - Fixed a bug in the binary search mode of auto batch size scaling where exception was raised if the first trainer run resulted in OOM ([#8954](https://github.com/PyTorchLightning/pytorch-lightning/pull/8954))
 - Fixed a bug causing logging with `log_gpu_memory='min_max'` not working ([#9013](https://github.com/PyTorchLightning/pytorch-lightning/pull/9013))

+
 ## [1.4.3] - 2021-08-17

 - Fixed plateau scheduler stepping on incomplete epoch ([#8861](https://github.com/PyTorchLightning/pytorch-lightning/pull/8861))
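
Not part of the commit: a minimal sketch of the failure mode the changelog entry above describes, assuming only standard PyTorch (`log_metrics_inplace` is a hypothetical stand-in, not the real method). Before this fix, logging rebound tensor values inside the caller's own dict, roughly as follows:

```python
import torch

def log_metrics_inplace(metrics):
    # Rough stand-in for the pre-fix behavior: detach tensors in the
    # very dict the caller passed in, instead of in a private copy.
    for key, val in metrics.items():
        if torch.is_tensor(val):
            metrics[key] = val.cpu().detach()

loss = torch.tensor(3.14, requires_grad=True)
metrics = {"loss": loss, "epoch": 1}
log_metrics_inplace(metrics)

# The caller's dict entry was silently replaced by a detached copy:
assert not metrics["loss"].requires_grad
```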

pytorch_lightning/loggers/comet.py

Lines changed: 3 additions & 3 deletions
@@ -241,11 +241,11 @@ def log_hyperparams(self, params: Union[Dict[str, Any], Namespace]) -> None:
     def log_metrics(self, metrics: Dict[str, Union[torch.Tensor, float]], step: Optional[int] = None) -> None:
         assert rank_zero_only.rank == 0, "experiment tried to log from global_rank != 0"
         # Comet.ml expects metrics to be a dictionary of detached tensors on CPU
-        for key, val in metrics.items():
+        metrics_without_epoch = metrics.copy()
+        for key, val in metrics_without_epoch.items():
             if is_tensor(val):
-                metrics[key] = val.cpu().detach()
+                metrics_without_epoch[key] = val.cpu().detach()

-        metrics_without_epoch = metrics.copy()
         epoch = metrics_without_epoch.pop("epoch", None)
         metrics_without_epoch = self._add_prefix(metrics_without_epoch)
         self.experiment.log_metrics(metrics_without_epoch, step=step, epoch=epoch)
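
A note on why a plain `metrics.copy()` suffices here: `dict.copy()` is shallow, so the copy shares the tensor values, but rebinding or popping a key in the copy leaves the caller's dict untouched. A small standalone sketch of that semantics (not taken from the patch):

```python
import torch

original = {"acc": torch.ones(1, requires_grad=True), "epoch": 1}
copied = original.copy()  # shallow: same value objects, independent key mapping

copied["acc"] = copied["acc"].cpu().detach()  # rebinds the key in the copy only
copied.pop("epoch", None)                     # removes the key from the copy only

assert original["acc"].requires_grad  # the caller's tensor is untouched
assert "epoch" in original            # the caller's keys are intact
```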

tests/loggers/test_comet.py

Lines changed: 12 additions & 0 deletions
@@ -15,6 +15,7 @@
 from unittest.mock import DEFAULT, patch

 import pytest
+from torch import tensor

 from pytorch_lightning import Trainer
 from pytorch_lightning.loggers import CometLogger
@@ -220,3 +221,14 @@ def test_comet_epoch_logging(comet, comet_experiment, tmpdir, monkeypatch):
     logger = CometLogger(project_name="test", save_dir=tmpdir)
     logger.log_metrics({"test": 1, "epoch": 1}, step=123)
     logger.experiment.log_metrics.assert_called_once_with({"test": 1}, epoch=1, step=123)
+
+
+@patch("pytorch_lightning.loggers.comet.CometExperiment")
+@patch("pytorch_lightning.loggers.comet.comet_ml")
+def test_comet_metrics_safe(comet, tmpdir, monkeypatch):
+    """Test that CometLogger.log_metrics doesn't do inplace modification of metrics."""
+    _patch_comet_atexit(monkeypatch)
+    logger = CometLogger(project_name="test", save_dir=tmpdir)
+    metrics = {"tensor": tensor([[1.0, 0.0], [0.0, 1.0]], requires_grad=True), "epoch": 1}
+    logger.log_metrics(metrics)
+    assert metrics["tensor"].requires_grad
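
The test's final `requires_grad` check works as a canary because `Tensor.detach()` returns a new tensor that is cut out of the autograd graph; had `log_metrics` rebound the dict entry to such a copy, the flag on `metrics["tensor"]` would read `False`. A standalone sketch of the property being relied on (not part of the commit):

```python
import torch

t = torch.ones(2, 2, requires_grad=True)
detached = t.cpu().detach()  # new tensor, outside the autograd graph

assert t.requires_grad             # the original still tracks gradients
assert not detached.requires_grad  # the detached copy does not
```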
