Commit 54085d1

awaelchli authored and lantiga committed
Avoid warning about logging interval for fast dev run (#18550)
(cherry picked from commit 670b490)
1 parent: 72c097e
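For context on the fix: `Trainer(fast_dev_run=True)` clamps the run to a single training batch, so the "number of training batches is smaller than the logging interval" warning fired on every fast-dev run even when nothing was misconfigured. Below is a minimal repro sketch of that scenario; `DemoModel`, `RandomDataset`, and all sizes are illustrative assumptions, not part of the commit.

# Minimal sketch of the false-positive scenario (hypothetical model and data).
import torch
from torch.utils.data import DataLoader, Dataset

from lightning.pytorch import LightningModule, Trainer


class RandomDataset(Dataset):
    def __len__(self):
        return 4

    def __getitem__(self, idx):
        return torch.randn(32)


class DemoModel(LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        return self.layer(batch).sum()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


# fast_dev_run=True runs exactly one training batch, so any log_every_n_steps > 1
# used to trigger the "training batches smaller than logging interval" warning.
trainer = Trainer(fast_dev_run=True, log_every_n_steps=2)
trainer.fit(DemoModel(), DataLoader(RandomDataset()))

Before this commit, the `fit` call above emitted the UserWarning; with the patch it stays silent.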

File tree

3 files changed (+9, -1 lines)

src/lightning/pytorch/CHANGELOG.md

Lines changed: 3 additions & 0 deletions
@@ -41,6 +41,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Fixed visual glitch with the TQDM progress bar leaving the validation bar incomplete before switching back to the training display ([#18503](https://github.com/Lightning-AI/lightning/pull/18503))
 
 
+- Fixed false positive warning about logging interval when running with `Trainer(fast_dev_run=True)` ([#18550](https://github.com/Lightning-AI/lightning/pull/18550))
+
+
 ## [2.0.7] - 2023-08-14
 
 ### Added

src/lightning/pytorch/loops/fit_loop.py

Lines changed: 1 addition & 1 deletion
@@ -277,7 +277,7 @@ def setup_data(self) -> None:
                 trainer.val_check_batch = int(self.max_batches * trainer.val_check_interval)
                 trainer.val_check_batch = max(1, trainer.val_check_batch)
 
-        if trainer.loggers and self.max_batches < trainer.log_every_n_steps:
+        if trainer.loggers and self.max_batches < trainer.log_every_n_steps and not trainer.fast_dev_run:
            rank_zero_warn(
                f"The number of training batches ({self.max_batches}) is smaller than the logging interval"
                f" Trainer(log_every_n_steps={trainer.log_every_n_steps}). Set a lower value for log_every_n_steps if"

tests/tests_pytorch/trainer/test_dataloaders.py

Lines changed: 5 additions & 0 deletions
@@ -17,6 +17,7 @@
 import numpy
 import pytest
 import torch
+from lightning_utilities.test.warning import no_warning_call
 from torch.utils.data import RandomSampler
 from torch.utils.data.dataloader import DataLoader
 from torch.utils.data.dataset import Dataset, IterableDataset
@@ -693,6 +694,10 @@ def test_warning_with_small_dataloader_and_logging_interval(tmpdir):
     )
     trainer.fit(model)
 
+    with no_warning_call(UserWarning, match="The number of training batches"):
+        trainer = Trainer(default_root_dir=tmpdir, fast_dev_run=True, log_every_n_steps=2)
+        trainer.fit(model)
+
 
 def test_warning_with_iterable_dataset_and_len(tmpdir):
     """Tests that a warning message is shown when an IterableDataset defines `__len__`."""
