Changelog

b8708

📦 llama-cpp
✨ 1 new feature · 🐛 1 bug fix

Summary

This release updates the pre-compiled binaries across multiple operating systems and hardware targets, adding support for ROCm 7.2 and CUDA 13.1, alongside minor test cleanup.

✨ New Features

  • New binaries provided for macOS/iOS, Linux (including Vulkan, ROCm 7.2, OpenVINO), Windows (including CUDA 12.4, CUDA 13.1, Vulkan, SYCL, HIP), and openEuler platforms.

🐛 Bug Fixes

  • Removed an obsolete .mjs script from the tests.