2.5.2
📦 pytorch-lightning
✨ 1 feature · 🐛 8 fixes · 🔧 7 symbols
Summary
This release introduces the `toggled_optimizer` context manager to LightningModule and resolves several bugs related to CLI integration, DDP synchronization, and checkpointing. Users are advised to update `fsspec` for cross-device checkpointing.
Migration Steps
- For cross-device local checkpoints, install `fsspec>=2025.5.0` if it is not already available.
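As a rough illustration of this requirement, the snippet below checks the installed `fsspec` version against the threshold noted above; it assumes the `packaging` helper is available alongside `fsspec`.

```python
# Hedged sketch: confirm the installed fsspec meets the cross-device
# checkpointing requirement noted above (assumes `packaging` is available).
import fsspec
from packaging.version import Version

if Version(fsspec.__version__) < Version("2025.5.0"):
    raise RuntimeError(
        "Upgrade fsspec (e.g. pip install -U 'fsspec>=2025.5.0') "
        "before relying on cross-device local checkpoints."
    )
```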
✨ New Features
- Add `toggled_optimizer(optimizer)` method to `LightningModule`, a context-manager version of `toggle_optimizer` and `untoggle_optimizer`.
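A minimal sketch of how the context manager can replace the paired `toggle_optimizer`/`untoggle_optimizer` calls in a manual-optimization `training_step`; the GAN-style module and loss below are purely illustrative.

```python
# Minimal sketch of the new context manager in manual optimization
# (the module and loss here are illustrative only).
import torch
import pytorch_lightning as pl

class TinyGAN(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.automatic_optimization = False  # manual optimization
        self.generator = torch.nn.Linear(8, 8)
        self.discriminator = torch.nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        opt_g, opt_d = self.optimizers()

        # Previously: self.toggle_optimizer(opt_g) ... self.untoggle_optimizer(opt_g)
        with self.toggled_optimizer(opt_g):
            loss_g = self.discriminator(self.generator(batch)).mean()
            opt_g.zero_grad()
            self.manual_backward(loss_g)
            opt_g.step()

    def configure_optimizers(self):
        return (
            torch.optim.Adam(self.generator.parameters()),
            torch.optim.Adam(self.discriminator.parameters()),
        )
```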
🐛 Bug Fixes
- Fixed `save_hyperparameters` not working correctly with `LightningCLI` when there are parsing links applied on instantiation.
- Fixed an edge case in `logger_connector` where `step` could be a float.
- Fixed SIGTERM handling in DDP by synchronizing it to prevent deadlocks.
- Fixed case-sensitive model name.
- CLI: resolve jsonargparse deprecation warning.
- Fix: move `check_inputs` to the target device if available during `to_torchscript` (see the sketch after this list).
- Fixed progress bar display to correctly handle iterable datasets and `max_steps` during training.
- Fixed a problem with silently supporting `jsonnet`.
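For the `to_torchscript` fix above, here is a hedged sketch of how `check_inputs` reaches the trace via the extra keyword arguments; the module, tensor shapes, and the optional `.cuda()` move are illustrative, and the device move only matters when the model is not on CPU.

```python
# Minimal sketch: tracing a LightningModule with `check_inputs`
# (module, shapes, and the optional .cuda() move are illustrative).
import torch
import pytorch_lightning as pl

class TinyModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 2)

    def forward(self, x):
        return self.layer(x)

model = TinyModule()
if torch.cuda.is_available():
    model = model.cuda()  # previously, CPU check_inputs could mismatch this device

scripted = model.to_torchscript(
    method="trace",
    example_inputs=torch.randn(1, 4),
    # Extra kwargs are forwarded to torch.jit.trace; with the fix these
    # tensors are moved to the model's device before the trace check.
    check_inputs=[(torch.randn(1, 4),), (torch.randn(2, 4),)],
)
```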
🔧 Affected Symbols
`LightningModule` · `save_hyperparameters` · `LightningCLI` · `toggle_optimizer` · `untoggle_optimizer` · `check_inputs` · `to_torchscript`