Commit fb37826

carmocca authored and lantiga committed
Extend warning about reducing non floating types (#18847)
(cherry picked from commit 245865d)
1 parent db5a7db commit fb37826


src/lightning/pytorch/trainer/connectors/logger_connector/result.py

Lines changed: 4 additions & 1 deletion
```diff
@@ -212,7 +212,10 @@ def update(self, value: _VALUE, batch_size: int) -> None:
             warning_cache.warn(
                 # do not include the value to avoid cache misses
                 f"You called `self.log({self.meta.name!r}, ...)` in your `{self.meta.fx}` but the value needs to"
-                f" be floating point. Converting it to {dtype}."
+                f" be floating to be reduced. Converting it to {dtype}."
+                " You can silence this warning by converting the value to floating point yourself."
+                " If you don't intend to reduce the value (for instance when logging the global step or epoch) then"
+                f" you can use `self.logger.log_metrics({{{self.meta.name!r}: ...}})` instead."
             )
             value = value.to(dtype)
         if value.dtype not in (torch.float32, torch.float64):
```
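The extended warning distinguishes values meant to be reduced across steps (which must be floating point) from values that merely record state (which should skip the reduction machinery entirely). Below is a minimal sketch of both cases; `MyModel` and the metric names are illustrative, assuming a `LightningModule` subclass with a configured logger:

```python
import torch
from lightning.pytorch import LightningModule


class MyModel(LightningModule):  # hypothetical example module
    def training_step(self, batch, batch_idx):
        loss = torch.zeros(1, requires_grad=True).sum()  # dummy loss

        # Logging an integer tensor triggers the extended warning, since
        # reduction (e.g. averaging across steps) needs a floating dtype:
        self.log("num_tokens", torch.tensor(128))

        # Silence the warning by converting to floating point yourself:
        self.log("num_tokens", torch.tensor(128).float())

        # For values that should not be reduced at all (e.g. the epoch or
        # global step), write to the logger directly, as the new warning
        # message suggests:
        self.logger.log_metrics({"epoch": self.current_epoch})

        return loss
```

The distinction matters because `self.log` routes the value through the `update` method patched above, which casts non-floating inputs and accumulates them for reduction, whereas `Logger.log_metrics` records the raw value for the current step without any accumulation.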
