b8967
📦 llama-cpp
✨ 1 feature 🔧 1 symbol
Summary
This release primarily ships pre-compiled binaries for a wide range of operating systems and hardware configurations, and adds Blackwell native NVFP4 support for ggml-cuda.
✨ New Features
- Added Blackwell native NVFP4 support for ggml-cuda.
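To take advantage of native NVFP4 kernels, ggml-cuda must be compiled with CUDA enabled and with a Blackwell target architecture. A minimal build sketch follows; the architecture value `120` (consumer Blackwell, e.g. RTX 50-series) is an assumption — adjust `CMAKE_CUDA_ARCHITECTURES` to match your GPU's compute capability, and note that a sufficiently recent CUDA toolkit is required for Blackwell targets.

```shell
# Sketch: configure llama.cpp with CUDA support targeting Blackwell.
# "120" is an assumed compute capability for consumer Blackwell GPUs;
# data-center Blackwell parts use a different value (e.g. 100).
cmake -B build \
    -DGGML_CUDA=ON \
    -DCMAKE_CUDA_ARCHITECTURES=120

# Build in Release mode; NVFP4 paths are selected at runtime on
# hardware that supports them.
cmake --build build --config Release -j
```

If the binaries attached to this release already match your platform and GPU generation, building from source is unnecessary.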