Commit 01bd3ce

Merge branch 'master' into dependabot-pip-requirements-click-8.3.0

2 parents: 6126877 + 4662d0c
File tree

7 files changed: +18 additions, -12 deletions

7 files changed

+18
-12
lines changed

.github/workflows/probot-check-group.yml

Lines changed: 2 additions & 2 deletions

@@ -12,14 +12,14 @@ jobs:
   required-jobs:
     runs-on: ubuntu-latest
     if: github.event.pull_request.draft == false
-    timeout-minutes: 61 # in case something is wrong with the internal timeout
+    timeout-minutes: 71 # in case something is wrong with the internal timeout
     steps:
       - uses: Lightning-AI/[email protected]
         env:
           GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
         with:
           job: check-group
           interval: 180 # seconds
-          timeout: 60 # minutes
+          timeout: 70 # minutes
           maintainers: "Lightning-AI/lai-frameworks"
           owner: "carmocca"

.lightning/workflows/fabric.yml

Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@ trigger:
   pull_request:
     branches: ["master", "release/stable"]
 
-timeout: "55" # minutes
+timeout: "60" # minutes
 parametrize:
   matrix: {}
   include:

.lightning/workflows/pytorch.yml

Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@ trigger:
   pull_request:
     branches: ["master", "release/stable"]
 
-timeout: "55" # minutes
+timeout: "60" # minutes
 parametrize:
   matrix: {}
   include:

docs/source-pytorch/common/checkpointing_intermediate.rst

Lines changed: 7 additions & 1 deletion

@@ -21,7 +21,13 @@ For fine-grained control over checkpointing behavior, use the :class:`~lightning
     checkpoint_callback = ModelCheckpoint(dirpath="my/path/", save_top_k=2, monitor="val_loss")
     trainer = Trainer(callbacks=[checkpoint_callback])
     trainer.fit(model)
-    checkpoint_callback.best_model_path
+
+    # Access best and last model checkpoint directly from the callback
+    print(checkpoint_callback.best_model_path)
+    print(checkpoint_callback.last_model_path)
+    # Or via the trainer
+    print(trainer.checkpoint_callback.best_model_path)
+    print(trainer.checkpoint_callback.last_model_path)
 
 Any value that has been logged via *self.log* in the LightningModule can be monitored.
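The documentation snippet above reads `best_model_path` after training with `save_top_k=2` and `monitor="val_loss"`. As an illustration only, here is a minimal, hypothetical sketch of the bookkeeping a save-top-k-style callback performs; the `TopKTracker` name and API are invented for this example and are not Lightning's implementation:

```python
import heapq


class TopKTracker:
    """Hypothetical sketch of save_top_k-style bookkeeping: keep the k
    checkpoints with the lowest monitored value (e.g. val_loss, mode='min')."""

    def __init__(self, k: int):
        self.k = k
        # Max-heap via negated scores: the worst retained score sits at the root.
        self._heap: list = []

    def update(self, score: float, path: str) -> bool:
        """Record a checkpoint; return True if it is kept among the top k."""
        if len(self._heap) < self.k:
            heapq.heappush(self._heap, (-score, path))
            return True
        if -self._heap[0][0] > score:  # new score beats the current worst
            heapq.heapreplace(self._heap, (-score, path))
            return True
        return False

    @property
    def best_model_path(self) -> str:
        # Largest negated score == smallest monitored value among retained.
        return max(self._heap)[1] if self._heap else ""


tracker = TopKTracker(k=2)
tracker.update(0.50, "epoch0.ckpt")  # kept
tracker.update(0.30, "epoch1.ckpt")  # kept
tracker.update(0.40, "epoch2.ckpt")  # kept, evicts epoch0.ckpt
print(tracker.best_model_path)  # epoch1.ckpt
```

The real callback additionally writes and deletes checkpoint files as entries enter and leave the retained set; this sketch only tracks the scores and paths.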

requirements/pytorch/test.txt

Lines changed: 1 addition & 1 deletion

@@ -12,7 +12,7 @@ numpy >1.20.0, <1.27.0
 onnx >1.12.0, <1.20.0
 onnxruntime >=1.12.0, <1.23.0
 onnxscript >= 0.1.0, < 0.5.0
-psutil <7.0.1 # for `DeviceStatsMonitor`
+psutil <7.1.1 # for `DeviceStatsMonitor`
 pandas >2.0, <2.4.0 # needed in benchmarks
 fastapi # for `ServableModuleValidator` # not setting version as re-defined in App
 uvicorn # for `ServableModuleValidator` # not setting version as re-defined in App
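The pin change above raises the exclusive upper bound on psutil from `<7.0.1` to `<7.1.1`. As a deliberately simplified sketch of how such an exclusive bound is evaluated (ignoring pre-releases, epochs, and the other PEP 440 rules a real resolver applies; both function names here are invented for illustration):

```python
def version_tuple(v: str) -> tuple:
    # Parse a simple dotted release version, e.g. "7.1.1" -> (7, 1, 1).
    return tuple(int(part) for part in v.split("."))


def satisfies_upper_bound(installed: str, bound: str) -> bool:
    # An exclusive "<" pin holds when the installed version compares
    # strictly below the bound, component by component.
    return version_tuple(installed) < version_tuple(bound)


print(satisfies_upper_bound("7.1.0", "7.1.1"))  # True: admitted by the new pin
print(satisfies_upper_bound("7.1.1", "7.1.1"))  # False: the bound is exclusive
print(satisfies_upper_bound("7.0.1", "7.0.1"))  # False: excluded by the old pin
```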

requirements/typing.txt

Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
-mypy==1.18.1
+mypy==1.18.2
 torch==2.8.0
 
 types-Markdown

src/lightning/pytorch/callbacks/model_checkpoint.py

Lines changed: 5 additions & 5 deletions

@@ -204,11 +204,11 @@ class ModelCheckpoint(Checkpoint):
     ... )
 
     # retrieve the best checkpoint after training
-    checkpoint_callback = ModelCheckpoint(dirpath='my/path/')
-    trainer = Trainer(callbacks=[checkpoint_callback])
-    model = ...
-    trainer.fit(model)
-    checkpoint_callback.best_model_path
+    >>> checkpoint_callback = ModelCheckpoint(dirpath='my/path/')
+    >>> trainer = Trainer(callbacks=[checkpoint_callback])
+    >>> model = ...  # doctest: +SKIP
+    >>> trainer.fit(model)  # doctest: +SKIP
+    >>> print(checkpoint_callback.best_model_path)  # doctest: +SKIP
 
 .. tip:: Saving and restoring multiple checkpoint callbacks at the same time is supported under variation in the
     following arguments:
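The diff above converts the docstring example to doctest syntax and marks the lines that need a real model with `# doctest: +SKIP`. A small self-contained sketch of how that directive behaves, using only the standard-library doctest module (the `load_checkpoint_and_train` name is invented and is never called, precisely because of +SKIP):

```python
import doctest

source = """
>>> 1 + 1
2
>>> load_checkpoint_and_train()  # doctest: +SKIP
"""

parser = doctest.DocTestParser()
test = parser.get_doctest(
    source, globs={}, name="checkpoint_example", filename="<example>", lineno=0
)
runner = doctest.DocTestRunner(verbose=False)
runner.run(test)

# The +SKIP example is parsed but never executed, so the undefined
# function above does not produce a failure.
print(runner.failures)  # 0
```

This is why the diff can keep expensive lines like `trainer.fit(model)` in the rendered example while excluding them from the doctest run.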
