Commit 5c6cdc0: updated docs

1 parent b198435

File tree

1 file changed: +32 −3 lines

docs/Trainer/Checkpointing.md

Lines changed: 32 additions & 3 deletions
@@ -29,11 +29,40 @@ Lightning will restore the session if you pass an experiment with the same version
 from test_tube import Experiment
 
 exp = Experiment(version=a_previous_version_with_a_saved_checkpoint)
-Trainer(experiment=exp)
+trainer = Trainer(experiment=exp)
 
-trainer = Trainer(checkpoint_callback=checkpoint_callback)
-# the trainer is now restored
+# this fit call loads model weights and trainer state
+# the trainer continues seamlessly from where you left off
+# without having to do anything else
+trainer.fit(model)
 ```
 
+The trainer restores:
+- global_step
+- current_epoch
+- All optimizers
+- All lr_schedulers
+- Model weights
+
+You can even change the logic of your model, as long as the weights and "architecture" of
+the system aren't different. If you add a layer, for instance, it might not work.
+
+At a rough level, here's [what happens inside Trainer](https://github.com/williamFalcon/pytorch-lightning/blob/master/pytorch_lightning/root_module/model_saving.py#L63):
+```python
+self.global_step = checkpoint['global_step']
+self.current_epoch = checkpoint['epoch']
+
+# restore the optimizers
+optimizer_states = checkpoint['optimizer_states']
+for optimizer, opt_state in zip(self.optimizers, optimizer_states):
+    optimizer.load_state_dict(opt_state)
 
+# restore the lr schedulers
+lr_schedulers = checkpoint['lr_schedulers']
+for scheduler, lrs_state in zip(self.lr_schedulers, lr_schedulers):
+    scheduler.load_state_dict(lrs_state)
 
+# uses the model you passed into trainer
+model.load_state_dict(checkpoint['state_dict'])
+```
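The restore pattern in the diff above can be sketched as a small self-contained example. Note this is an illustration of the pattern only, not Lightning's actual API: `Stateful` and `TrainerSketch` below are hypothetical stand-ins for torch optimizers, schedulers, and the Trainer, which expose the same `state_dict()`/`load_state_dict()` interface.

```python
# Minimal sketch of the checkpoint-restore pattern shown above.
# `Stateful` is a hypothetical stand-in for torch optimizers/schedulers/models,
# which all expose state_dict()/load_state_dict().

class Stateful:
    def __init__(self):
        self.state = {}

    def state_dict(self):
        return dict(self.state)

    def load_state_dict(self, state):
        self.state = dict(state)


class TrainerSketch:
    def __init__(self, optimizers, lr_schedulers):
        self.optimizers = optimizers
        self.lr_schedulers = lr_schedulers
        self.global_step = 0
        self.current_epoch = 0

    def restore(self, checkpoint, model):
        # mirrors the logic quoted from model_saving.py above
        self.global_step = checkpoint['global_step']
        self.current_epoch = checkpoint['epoch']

        # restore each optimizer's state
        for optimizer, opt_state in zip(self.optimizers, checkpoint['optimizer_states']):
            optimizer.load_state_dict(opt_state)

        # restore each lr scheduler's state
        for scheduler, lrs_state in zip(self.lr_schedulers, checkpoint['lr_schedulers']):
            scheduler.load_state_dict(lrs_state)

        # restore model weights
        model.load_state_dict(checkpoint['state_dict'])


# usage: a checkpoint is just a dict with these keys
opt, sched, model = Stateful(), Stateful(), Stateful()
trainer = TrainerSketch([opt], [sched])
checkpoint = {
    'global_step': 500,
    'epoch': 3,
    'optimizer_states': [{'lr': 0.01}],
    'lr_schedulers': [{'last_epoch': 3}],
    'state_dict': {'w': 1.0},
}
trainer.restore(checkpoint, model)
print(trainer.global_step, trainer.current_epoch)  # 500 3
```

The key design point this illustrates is why restoration is transparent: every stateful component serializes to a plain dict, so the trainer only needs to pair saved states with live objects (the `zip` calls) and hand each one back.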

0 commit comments