
b8373

📦 llama-cpp
🐛 1 fix · 🔧 1 symbol

Summary

This release contains a single bug fix: a flash attention dot product precision issue in the Vulkan backend.

🐛 Bug Fixes

  • Fixed flash attention dot product precision issue on Vulkan backend.
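To illustrate the class of problem this fix targets (the release notes do not include the actual shader change, so this is a hypothetical sketch, not llama.cpp code): when a long attention dot product is accumulated in half precision, rounding error grows with the context length, whereas keeping the running sum in a wider type stays close to the exact result.

```python
import numpy as np

# Hypothetical illustration of dot-product accumulation precision.
# The values, context length, and method here are assumptions for
# demonstration, not taken from the llama.cpp Vulkan shaders.
rng = np.random.default_rng(0)
n = 4096  # a plausible attention context length
a = rng.standard_normal(n).astype(np.float16)
b = rng.standard_normal(n).astype(np.float16)

# Reference: accumulate the dot product in float64.
ref = np.dot(a.astype(np.float64), b.astype(np.float64))

# Low-precision path: running sum kept in float16 (rounds every step).
acc16 = np.float16(0.0)
for x, y in zip(a, b):
    acc16 = np.float16(acc16 + x * y)

# Wider accumulator: products and running sum in float32.
acc32 = np.float32(0.0)
for x, y in zip(a, b):
    acc32 += np.float32(x) * np.float32(y)

err16 = abs(float(acc16) - ref)
err32 = abs(float(acc32) - ref)
print(f"fp16 accumulator error: {err16:.6f}")
print(f"fp32 accumulator error: {err32:.6f}")
```

The float32 accumulator's error is orders of magnitude smaller, which is why attention kernels typically widen the accumulation type even when inputs are half precision.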

Affected Symbols