2.5.4
📦 pytorch-lightning
✨ 1 feature · 🐛 5 fixes · 🔧 5 symbols
Summary
This patch release for PyTorch Lightning focuses on bug fixes across checkpointing, callbacks, and strategy integrations. Lightning Fabric also added support for NVIDIA H200 GPUs.
✨ New Features
- Added support for NVIDIA H200 GPUs in `get_available_flops`.
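For reference, a minimal sketch of querying the new H200 entry. It assumes `get_available_flops` is importable from `lightning.fabric.utilities.throughput` and accepts a device plus a dtype; verify both against your installed Lightning version.

```python
# Sketch only: assumes get_available_flops(device, dtype) from
# lightning.fabric.utilities.throughput; check your Lightning version.
import torch
from lightning.fabric.utilities.throughput import get_available_flops

device = torch.device("cuda", 0)
# Returns the theoretical peak FLOPs for the detected GPU (now including H200),
# or None if the device is not in the lookup table.
peak_flops = get_available_flops(device, torch.bfloat16)
print(f"Peak bf16 FLOPs on {torch.cuda.get_device_name(device)}: {peak_flops}")
```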
🐛 Bug Fixes
- Fixed `AsyncCheckpointIO` to snapshot tensors, avoiding a race with parameter mutation (usage sketch after this list).
- Fixed an `AsyncCheckpointIO` thread-pool exception raised when calling `fit` or `validate` more than once.
- Fixed the learning rate not being correctly set after using the `LearningRateFinder` callback.
- Fixed column misalignment in the rich model summary when using `DeepSpeedStrategy`.
- Fixed `RichProgressBar` crashing during sanity checking when the validation dataloader has zero length.
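A minimal sketch of a `Trainer` that exercises two of the fixed components, the `AsyncCheckpointIO` plugin and the `LearningRateFinder` callback. The import paths below are assumptions, and the model/datamodule names are hypothetical placeholders.

```python
# Sketch only: assumes AsyncCheckpointIO is exposed under lightning.pytorch.plugins
# and LearningRateFinder under lightning.pytorch.callbacks.
from lightning.pytorch import Trainer
from lightning.pytorch.callbacks import LearningRateFinder
from lightning.pytorch.plugins import AsyncCheckpointIO

trainer = Trainer(
    max_epochs=2,
    # Checkpoints are written on a background thread; the fixes above cover
    # repeated fit()/validate() calls and tensors mutated while a save is in flight.
    plugins=[AsyncCheckpointIO()],
    # The learning rate suggested by the finder is now applied to the optimizer.
    callbacks=[LearningRateFinder()],
)
# trainer.fit(model, datamodule=dm)  # `model` and `dm` are hypothetical placeholders
```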
🔧 Affected Symbols
`AsyncCheckpointIO` · `LearningRateFinder` · `DeepSpeedStrategy` · `RichProgressBar` · `get_available_flops`