Lightning v2.5.2

@Borda released this on 20 Jun 15:56

Notable changes in this release

PyTorch Lightning

Changed
  • Add enable_autolog_hparams argument to Trainer (#20593); see the sketch after this list
  • Add toggled_optimizer(optimizer) method to the LightningModule, a context-manager version of toggle_optimizer and untoggle_optimizer (#20771); see the sketch after this list
  • For cross-device local checkpoints, instruct users to install fsspec>=2025.5.0 if unavailable (#20780)
  • Check that param is an nn.Parameter during pruning sanitization (#20783)
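
A minimal sketch of the two additions above, assuming a two-optimizer module with manual optimization; the model, shapes, and learning rates are illustrative, not taken from the release:

```python
import torch
import lightning.pytorch as pl


class ManualOptModel(pl.LightningModule):  # hypothetical example module
    def __init__(self):
        super().__init__()
        self.automatic_optimization = False  # manual optimization enables toggling
        self.gen = torch.nn.Linear(4, 4)
        self.disc = torch.nn.Linear(4, 1)

    def training_step(self, batch, batch_idx):
        opt_g, _opt_d = self.optimizers()
        # Context-manager form of toggle_optimizer/untoggle_optimizer:
        # inside the block only opt_g's parameters require gradients,
        # and the previous requires_grad state is restored on exit.
        with self.toggled_optimizer(opt_g):
            loss = self.disc(self.gen(batch)).mean()
            opt_g.zero_grad()
            self.manual_backward(loss)
            opt_g.step()

    def configure_optimizers(self):
        return (
            torch.optim.SGD(self.gen.parameters(), lr=0.1),
            torch.optim.SGD(self.disc.parameters(), lr=0.1),
        )


# enable_autolog_hparams=False turns off the automatic hyperparameter
# logging at the start of fit; the default keeps the previous behavior.
trainer = pl.Trainer(enable_autolog_hparams=False, max_epochs=1)
```
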
Fixed
  • Fixed save_hyperparameters not working correctly with LightningCLI when parser argument links are applied on instantiation (#20777)
  • Fixed an edge case in logger_connector where step could be a float (#20692)
  • Fixed deadlocks in DDP by synchronizing SIGTERM handling (#20825)
  • Fixed case-sensitive model name handling (#20661)
  • Fixed a jsonargparse deprecation warning in the CLI (#20802)
  • Fixed to_torchscript to move check_inputs to the target device when available (#20873); see the sketch after this list
  • Fixed progress bar display to correctly handle iterable dataset and max_steps during training (#20869)
  • Fixed a problem with silently supporting jsonnet (#20899)
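
The to_torchscript fix concerns tracing: check_inputs is forwarded to torch.jit.trace, and with this change those tensors are moved to the model's device when one is available. A minimal sketch, with an assumed module and illustrative shapes:

```python
import torch
import lightning.pytorch as pl


class TinyModel(pl.LightningModule):  # hypothetical example model
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 2)

    def forward(self, x):
        return self.layer(x)


device = "cuda" if torch.cuda.is_available() else "cpu"
model = TinyModel().to(device)

# check_inputs is passed through to torch.jit.trace; the CPU tensors
# below no longer need to be moved to `device` by hand.
scripted = model.to_torchscript(
    method="trace",
    example_inputs=torch.randn(4, 8),
    check_inputs=[(torch.randn(4, 8),)],
)
```
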

Lightning Fabric

Changed
  • Ensure the correct device is used for autocast when mps is selected as the Fabric accelerator (#20876); see the sketch below
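
A minimal sketch of the mps autocast path, assuming an Apple-silicon machine; the precision setting and model are illustrative, not from the release:

```python
import torch
from lightning.fabric import Fabric

fabric = Fabric(accelerator="mps", precision="16-mixed")
model = fabric.setup(torch.nn.Linear(4, 4))

# fabric.autocast() now enters autocast with the device type matching
# the selected accelerator ("mps" here), as ensured by this change.
with fabric.autocast():
    out = model(torch.randn(2, 4, device=fabric.device))
```
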
Fixed
  • Fixed TransformerEnginePrecision conversion for layers with bias=False (#20805)

Full commit list: 2.5.1 -> 2.5.2

Contributors

We thank all folks who submitted issues, features, fixes, and doc changes. It's the only way we can collectively make Lightning ⚡ better for everyone. Nice job!

In particular, we would like to thank the authors of the pull requests above, in no particular order:

@adamjstewart, @Armannas, @bandpooja, @Borda, @chanokin, @duydl, @GdoongMathew, @KAVYANSHTYAGI, @mauvilsa, @muthissar, @rustamzh, @siemdejong

Thank you ❤️ and we hope you'll keep them coming!