
v4.51.3

📦 transformers
✨ 1 feature · 🐛 2 fixes · 🔧 2 symbols

Summary

This patch release adds support for the GLM-4 model architecture and includes two fixes for PyTorch version handling in the FlexAttention integration.

✨ New Features

  • Added support for the GLM-4 model architecture (see the usage sketch below).
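
With this release, GLM-4 loads through the standard Auto classes. The following is a minimal sketch; the checkpoint id used here is an assumption for illustration and should be replaced with the GLM-4 checkpoint you actually intend to use.

```python
# Minimal sketch: loading a GLM-4 checkpoint via the Auto classes.
# "THUDM/GLM-4-9B-0414" is an assumed checkpoint id, used for illustration only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "THUDM/GLM-4-9B-0414"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # requires `accelerate`; drop for plain single-device loading
)

inputs = tokenizer("Hello, GLM-4!", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```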

🐛 Bug Fixes

  • Improved torch version handling in the FlexAttention integration.
  • Fixed edge cases in torch version detection (see the version-guard sketch after this list).
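
Both fixes concern gating FlexAttention on the installed torch version. Below is a minimal sketch of such a guard, assuming FlexAttention requires torch >= 2.5; the helper name and threshold are illustrative, not the library's actual code.

```python
# Illustrative version guard for FlexAttention, not transformers' actual helper.
from packaging import version

import torch


def is_flex_attention_usable(min_torch: str = "2.5.0") -> bool:
    """Return True if the installed torch exposes FlexAttention (assumed >= 2.5)."""
    # Strip local/dev suffixes such as "2.6.0.dev20241112+cu121" before comparing,
    # so nightly builds do not trip up the check (one of the edge cases a naive
    # string comparison gets wrong).
    installed = version.parse(version.parse(torch.__version__).base_version)
    return installed >= version.parse(min_torch)


if is_flex_attention_usable():
    # Import only when the API actually exists in this torch build.
    from torch.nn.attention.flex_attention import flex_attention
```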

🔧 Affected Symbols

  • GLM4
  • flexattn