b8555
📦 llama-cpp · Breaking Changes
⚠️ 1 breaking · ✨ 1 feature · 🐛 1 fix · 🔧 4 symbols
Summary
This release cleans up server configuration by removing the `verbose_prompt` parameter (and reverting a related change), and ships numerous updated pre-built binaries for a wide range of hardware and operating systems.
⚠️ Breaking Changes
- The `verbose_prompt` parameter has been removed from the server component. Users relying on it for verbose prompt output must switch to alternative logging/debugging methods.
Migration Steps
- If you were using the `--verbose-prompt` parameter for llama-server, remove it from your command line arguments.
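The migration step above can be sketched as a one-line edit to an existing launch command. The model path, port, and other flags below are placeholders, not taken from the release notes:

```shell
# Hypothetical pre-b8555 launch command using the now-removed flag.
OLD_CMD="./llama-server -m model.gguf --port 8080 --verbose-prompt"

# Drop the removed --verbose-prompt flag; everything else stays the same.
NEW_CMD="${OLD_CMD/ --verbose-prompt/}"

echo "$NEW_CMD"
# → ./llama-server -m model.gguf --port 8080
```

If the flag is passed via a config file or wrapper script instead, remove it there; the server will reject or ignore the unknown argument depending on its argument-parsing behavior.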
✨ New Features
- Switched from `set_excludes` to `set_examples`; the exact context is not specified in the changelog, but it likely relates to server configuration or model loading.
🐛 Bug Fixes
- Reverted a previous change related to respecting the `verbose_prompt` parameter (commit 8ed885cf375b2c8ba641c661f3667df70b9797f4).