
v4.55.2

Breaking Changes
📦 transformers
⚠️ 1 breaking · 🐛 1 fix · 🔧 2 symbols

Summary

Patch release 4.55.2 fixes a critical regression in Flash Attention 2 (FA2) generations caused by a missing utility import in version 4.55.1.

⚠️ Breaking Changes

  • The 4.55.1 release was broken for Flash Attention 2 (FA2) generations due to a missing import: `prepare_fa_kwargs_from_position_ids`.

Migration Steps

  1. Upgrade from 4.55.1 to 4.55.2 immediately if you use Flash Attention 2 (see the sketch below).
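
A minimal upgrade-and-verify sketch, assuming a pip-managed environment. The module path `transformers.modeling_flash_attention_utils` is inferred from the affected symbols listed at the end of this note, not stated explicitly in the release:

```python
# Upgrade first (shell): pip install --upgrade "transformers==4.55.2"
# Then verify that the utility whose missing import broke FA2 generations
# on 4.55.1 is importable again. The module path below is inferred from
# the affected symbols listed in this release note.
import transformers
from transformers.modeling_flash_attention_utils import (
    prepare_fa_kwargs_from_position_ids,
)

print(transformers.__version__)             # expect "4.55.2"
print(prepare_fa_kwargs_from_position_ids)  # resolves without an ImportError
```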

🐛 Bug Fixes

  • Fixed a regression in Flash Attention 2 (FA2) generations caused by a git merge conflict that dropped a utility function import (a sketch of the affected usage follows).
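
For context, a hedged sketch of the kind of generation call affected by the regression. The model id is a hypothetical placeholder, and FA2 additionally requires a CUDA GPU, fp16/bf16 weights, and the flash-attn package:

```python
# A sketch of an FA2 generation call of the kind that failed on 4.55.1.
# "your-org/your-model" is a hypothetical placeholder; FA2 needs a CUDA
# device, fp16/bf16 weights, and the flash-attn package installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/your-model"  # placeholder, not a real checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    attn_implementation="flash_attention_2",  # the code path hit by the regression
).to("cuda")

inputs = tokenizer("Hello world", return_tensors="pt").to("cuda")
output = model.generate(**inputs, max_new_tokens=20)  # errored on 4.55.1
print(tokenizer.decode(output[0], skip_special_tokens=True))
```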

🔧 Affected Symbols

  • modeling_flash_attention_utils
  • prepare_fa_kwargs_from_position_ids