b8890
📦 llama-cpp
✨ 1 feature · 🐛 2 fixes · 🔧 1 symbol
Summary
This release fixes type and linting errors and improves parallel tool call handling in the chat module: the default is now set dynamically based on the model's capabilities.
✨ New Features
- Improved handling of parallel tool calls by defaulting the setting based on model capabilities.
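A minimal sketch of the capability-based default described above. All names here (`ModelCapabilities`, `resolve_parallel_tool_calls`, `supports_parallel_tool_calls`) are illustrative assumptions, not the actual llama-cpp API:

```python
# Hypothetical sketch: choose the parallel-tool-calls default dynamically.
# These names are illustrative, not the real llama-cpp interfaces.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ModelCapabilities:
    # Whether the loaded model's chat template can emit multiple
    # tool calls in a single assistant turn (assumed field name).
    supports_parallel_tool_calls: bool


def resolve_parallel_tool_calls(
    requested: Optional[bool],
    caps: ModelCapabilities,
) -> bool:
    """Honor an explicit caller setting; otherwise default to what
    the model reports it can handle."""
    if requested is not None:
        return requested
    return caps.supports_parallel_tool_calls


caps = ModelCapabilities(supports_parallel_tool_calls=True)
print(resolve_parallel_tool_calls(None, caps))   # model capability decides
print(resolve_parallel_tool_calls(False, caps))  # explicit setting wins
```

The point of the change is the `None` branch: instead of a hard-coded default, an unset value falls back to the model's advertised capability, while an explicit `True`/`False` from the caller is always respected.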
🐛 Bug Fixes
- Fixed type errors.
- Fixed flake8 linting errors.