
Commit 656ef3c

Merge branch 'master' into fix/19427/double-iter
2 parents 2f9cd44 + 37f559e commit 656ef3c

File tree

6 files changed: +23 -7 lines changed

.github/markdown-links-config.json

Lines changed: 5 additions & 1 deletion

@@ -22,5 +22,9 @@
         "Accept-Encoding": "zstd, br, gzip, deflate"
       }
     }
-  ]
+  ],
+  "timeout": "20s",
+  "retryOn429": true,
+  "retryCount": 5,
+  "fallbackRetryDelay": "20s"
 }
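The added keys are standard markdown-link-check options: `timeout` bounds each request, `retryOn429` retries links that respond with HTTP 429 (rate limiting), `retryCount` caps those retries, and `fallbackRetryDelay` is the wait used when the response carries no `retry-after` header. A sketch of the resulting file's overall shape, assuming the usual markdown-link-check schema (the `urls` value is invented for illustration):

```json
{
  "httpHeaders": [
    {
      "urls": ["https://example.com"],
      "headers": { "Accept-Encoding": "zstd, br, gzip, deflate" }
    }
  ],
  "timeout": "20s",
  "retryOn429": true,
  "retryCount": 5,
  "fallbackRetryDelay": "20s"
}
```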

README.md

Lines changed: 6 additions & 0 deletions

@@ -55,6 +55,12 @@ ______________________________________________________________________
 
 
 
+# Why PyTorch Lightning?
+
+Training models in plain PyTorch is tedious and error-prone: you have to manually handle backprop, mixed precision, multi-GPU, and distributed training, often rewriting the same code for every new project. PyTorch Lightning organizes PyTorch code to automate those complexities so you can focus on your model and data, while keeping full control and scaling from CPU to multi-node without changing your core code. And if you do want to manage those details yourself, you can still opt into a more DIY approach.
+
+Fun analogy: if PyTorch is JavaScript, PyTorch Lightning is ReactJS or NextJS.
+
 # Lightning has 2 core packages
 
 [PyTorch Lightning: Train and deploy PyTorch at scale](#why-pytorch-lightning).
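Not part of the diff, but to ground the pitch above: a minimal, hypothetical sketch of the "organized" style the README describes. The model, data, and hyperparameters are invented for illustration; the `Trainer` owns the loop, devices, and precision.

```python
import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset
import lightning as L


class LitRegressor(L.LightningModule):
    """Toy model: everything model-specific lives in one organized class."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        # No manual .backward()/.step() or device placement: the Trainer
        # handles backprop, precision, and accelerator plumbing.
        x, y = batch
        loss = F.mse_loss(self.layer(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


if __name__ == "__main__":
    data = TensorDataset(torch.randn(256, 32), torch.randn(256, 1))
    trainer = L.Trainer(max_epochs=2, accelerator="auto")
    trainer.fit(LitRegressor(), DataLoader(data, batch_size=32))
```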

_notebooks

requirements/pytorch/base.txt

Lines changed: 1 addition & 1 deletion

@@ -1,7 +1,7 @@
 # NOTE: the upper bound for the package version is only set for CI stability, and it is dropped while installing this package
 # in case you want to preserve/enforce restrictions on the latest compatible version, add "strict" as an in-line comment
 
-torch >=2.1.0, <2.8.0
+torch >=2.1.0, <=2.8.0
 tqdm >=4.57.0, <4.68.0
 PyYAML >5.4, <6.1.0
 fsspec[http] >=2022.5.0, <2025.6.0

src/lightning/pytorch/callbacks/model_checkpoint.py

Lines changed: 8 additions & 2 deletions

@@ -133,9 +133,15 @@ class ModelCheckpoint(Checkpoint):
             will only save checkpoints at epochs 0 < E <= N
             where both values for ``every_n_epochs`` and ``check_val_every_n_epoch`` evenly divide E.
         save_on_train_epoch_end: Whether to run checkpointing at the end of the training epoch.
-            If this is ``False``, then the check runs at the end of the validation.
+            If ``True``, checkpoints are saved at the end of every training epoch.
+            If ``False``, checkpoints are saved at the end of validation.
+            If ``None`` (default), checkpointing behavior is determined by the training configuration:
+            if ``check_val_every_n_epoch != 1``, checkpointing is not performed at the end of
+            every training epoch; if there are no validation batches, checkpointing occurs at the
+            end of the training epoch; and if there is a non-default number of validation runs per
+            training epoch (``val_check_interval != 1``), checkpointing is performed after validation.
         enable_version_counter: Whether to append a version to the existing file name.
-            If this is ``False``, then the checkpoint files will be overwritten.
+            If ``False``, then the checkpoint files will be overwritten.
 
     Note:
         For extra customization, ModelCheckpoint includes the following attributes:
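A hedged usage sketch for the parameter documented above (the directory and metric name are invented; `monitor="val_loss"` assumes such a metric is logged during validation):

```python
import lightning as L
from lightning.pytorch.callbacks import ModelCheckpoint

# save_on_train_epoch_end=False: save after validation rather than at the
# end of each training epoch; None (default) lets Lightning decide from the
# training configuration as documented above.
checkpoint_cb = ModelCheckpoint(
    dirpath="checkpoints/",  # invented path
    monitor="val_loss",      # assumes this metric is logged in validation
    save_on_train_epoch_end=False,
)
trainer = L.Trainer(max_epochs=10, callbacks=[checkpoint_cb])
```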

src/lightning/pytorch/core/module.py

Lines changed: 2 additions & 2 deletions

@@ -466,10 +466,10 @@ def log(
         )
 
         # make sure user doesn't introduce logic for multi-dataloaders
-        if "/dataloader_idx_" in name:
+        if add_dataloader_idx and "/dataloader_idx_" in name:
             raise MisconfigurationException(
                 f"You called `self.log` with the key `{name}`"
-                " but it should not contain information about `dataloader_idx`"
+                " but it should not contain information about `dataloader_idx` when `add_dataloader_idx=True`"
             )
 
         value = apply_to_collection(value, (Tensor, numbers.Number), self.__to_tensor, name)
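To illustrate what the relaxed check permits, a hypothetical `validation_step` for a module with multiple validation dataloaders (the metric key and the `_shared_eval` helper are invented): with `add_dataloader_idx=False`, a user-chosen key may now embed the dataloader index itself, which the old substring check rejected unconditionally.

```python
# Hypothetical LightningModule method; `_shared_eval` is an invented helper.
def validation_step(self, batch, batch_idx, dataloader_idx=0):
    loss = self._shared_eval(batch)
    # Before this change, the "/dataloader_idx_" substring check raised
    # MisconfigurationException even when add_dataloader_idx=False, i.e.
    # when Lightning would not append its own suffix anyway.
    self.log(
        f"val_loss/dataloader_idx_{dataloader_idx}",
        loss,
        add_dataloader_idx=False,  # the key already encodes the dataloader index
    )
```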
