b8886
📦 llama-cpp
🐛 1 fix
Summary
This release updates the pre-built binaries for numerous platforms, adding support for ROCm 7.2 and OpenVINO 2026.0 on Linux, and for CUDA 12.4/13.1 on Windows. It also includes a minor server change: reasoning content returned by the transcription API is now ignored.
🐛 Bug Fixes
- Server now ignores reasoning content received from the transcription API.