Changelog

b8201

📦 llama-cpp
🐛 1 fix · 🔧 1 symbol

Summary

This release addresses a critical bug in WebGPU wait logic for inflight jobs and provides updated pre-compiled binaries for numerous platforms and hardware accelerators.

🐛 Bug Fixes

  • Fixed WebGPU wait logic for inflight jobs by refactoring wait and submit to operate on vector<wgpu::FutureWaitInfo> and ensuring only completed futures are deleted.

Affected Symbols