b8981
📦 llama-cpp
🐛 1 fix
Summary
This release primarily contains a bug fix related to how prompt tokens are handled during reasoning budget sampling. It also provides numerous pre-compiled binaries for various operating systems and hardware configurations.
🐛 Bug Fixes
- Prompt tokens are no longer passed to the reasoning budget sampler.
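To illustrate the fix in spirit: a reasoning budget sampler caps how many reasoning tokens the model may generate, so it should only ever see generated tokens. If prompt tokens are also fed to it, a long prompt drains the budget before generation even begins. The sketch below is a minimal, hypothetical model of that behavior; the names (`reasoning_budget_sampler`, `accept_generated`, `remaining_budget`) are illustrative and are not llama-cpp's actual API.

```cpp
#include <vector>
#include <cstdint>

// Hypothetical sketch of a reasoning budget sampler: it limits how many
// reasoning tokens the model may emit before it must stop reasoning.
struct reasoning_budget_sampler {
    int budget;   // max reasoning tokens allowed during generation
    int used = 0; // reasoning tokens consumed so far

    // Called once per *generated* token. Returns true while the model may
    // keep reasoning, false once the budget is exhausted.
    bool accept_generated(int32_t /*token*/) {
        if (used >= budget) {
            return false;
        }
        ++used;
        return true;
    }
};

// The essence of the fix: prompt tokens must not be counted against the
// budget, so they are deliberately ignored here.
inline int remaining_budget(const reasoning_budget_sampler & s,
                            const std::vector<int32_t> & prompt_tokens) {
    (void) prompt_tokens; // prompt tokens do not consume the budget
    return s.budget - s.used;
}
```

With this accounting, a 1000-token prompt leaves the budget untouched; only tokens produced during generation count toward it.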