b8748
📦 llama-cpp
🐛 1 fix · 🔧 1 symbol
Summary
This release fixes an issue in llama-server where the --alias flag conflicted with model presets when both were supplied, and it ships updated pre-compiled binaries for a broad range of platforms.
🐛 Bug Fixes
- The llama-server now correctly ignores the --alias flag when --models-preset is used, preventing conflicts when initializing the router models.
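A minimal invocation sketch of the fixed behavior; the preset name and alias below are placeholders, and the exact preset file format is not specified in these notes:

```shell
# Hypothetical example: when --models-preset is present, llama-server now
# ignores --alias rather than letting the two settings conflict while the
# router models are being initialized.
llama-server \
  --models-preset my-preset \
  --alias my-model        # ignored; model names come from the preset
```

In earlier builds, passing both flags could leave the router models in a conflicting state at startup; with this fix the preset takes precedence.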