Changelog

2.6.1

📦 pytorch-lightning

Summary

This patch introduces method chaining for freezing/unfreezing modules and adds litlogger integration. It also removes support for Python 3.9 and fixes several bugs related to checkpointing, hyperparameter saving, and distributed sampling.

⚠️ Breaking Changes

  • Removed support for Python 3.9. Users must upgrade to Python 3.10 or newer.

Migration Steps

  1. If you rely on Python 3.9, upgrade your environment to Python 3.10 or newer.

✨ New Features

  • Added method chaining support to `LightningModule.freeze()` and `LightningModule.unfreeze()` by returning `self`.
  • Added litlogger integration.
  • Exposed `weights_only` argument for loading checkpoints in `Fabric.load()` and `Fabric.load_raw()`.
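The chaining change means `freeze()` and `unfreeze()` now return the module instead of `None`, so the call can be used inline. A toy sketch of the pattern (the class and attribute names here are illustrative stand-ins, not the real LightningModule API):

```python
class ToyModule:
    """Minimal stand-in for a LightningModule (illustrative only)."""

    def __init__(self):
        # Mimic parameters that track whether they receive gradients.
        self.requires_grad = {"weight": True, "bias": True}

    def freeze(self):
        for name in self.requires_grad:
            self.requires_grad[name] = False
        return self  # returning self is what enables method chaining

    def unfreeze(self):
        for name in self.requires_grad:
            self.requires_grad[name] = True
        return self

# Chained call: construct and freeze in a single expression.
model = ToyModule().freeze()
```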

🐛 Bug Fixes

  • Fixed `save_hyperparameters(ignore=...)` behavior so subclass ignore rules now correctly override base class rules.
  • Fixed `LightningDataModule.load_from_checkpoint` to restore the datamodule subclass and hyperparameters.
  • Fixed `ModelParallelStrategy` single-file checkpointing when `torch.compile` wraps the model, preventing optimizer states from raising `KeyError` during save.
  • Sanitized profiler filenames when saving to avoid crashes due to invalid characters.
  • Fixed `StochasticWeightAveraging` behavior when used with infinite epochs.
  • Fixed `_generate_seed_sequence_sampling` function to ensure it produces unique seeds.
  • Prevented the `ThroughputMonitor` callback from emitting warnings too frequently.
  • Fixed `DistributedSamplerWrapper` to correctly forward `set_epoch` to the underlying sampler.
  • Fixed DDP notebook CUDA fork check to allow passive initialization when CUDA is not actively used.
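The `set_epoch` fix matters because a wrapper that swallows `set_epoch` silently reuses epoch 0's sample order every epoch. A toy sketch of the forwarding behavior (class names here are illustrative, not the actual `DistributedSamplerWrapper` implementation):

```python
class DeterministicSampler:
    """Toy sampler whose iteration order depends on the current epoch."""

    def __init__(self, size):
        self.size = size
        self.epoch = 0

    def set_epoch(self, epoch):
        self.epoch = epoch

    def __iter__(self):
        # Rotate the index order by the epoch so each epoch yields a new order.
        return iter([(i + self.epoch) % self.size for i in range(self.size)])


class SamplerWrapper:
    """Sketch of the fix: the wrapper forwards set_epoch to the inner sampler."""

    def __init__(self, sampler):
        self.sampler = sampler

    def set_epoch(self, epoch):
        # Without this forwarding, the inner sampler would repeat epoch 0's order.
        self.sampler.set_epoch(epoch)

    def __iter__(self):
        return iter(self.sampler)


wrapper = SamplerWrapper(DeterministicSampler(4))
wrapper.set_epoch(1)
print(list(wrapper))  # [1, 2, 3, 0]
```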


⚡ Deprecations

  • The `to_torchscript` method is deprecated because TorchScript itself is deprecated in PyTorch.