v0.14.18
📦 llamaindex
✨ 1 feature · 🐛 9 fixes · ⚡ 1 deprecation · 🔧 2 symbols
Summary
This release primarily focuses on dependency updates and on deprecating Python 3.9 support across numerous packages. Core improvements include aligned text-match filters and fixes for chat streaming, metadata preservation, and structured output parsing.
✨ New Features
- Align text match filters across core and vector backends in llama-index-core.
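The aligned semantics can be illustrated with a minimal, hypothetical filter predicate. The `matches` helper and operator strings below are illustrative only, not the llama-index filter API:

```python
from typing import Any

def matches(metadata: dict[str, Any], key: str, value: str, operator: str) -> bool:
    """Hypothetical sketch: "text_match" performs a substring match, while
    "==" requires exact equality. The fix aligns the substring semantics
    between the in-memory (core) and vector-store filter paths."""
    field = str(metadata.get(key, ""))
    if operator == "text_match":
        return value in field
    if operator == "==":
        return field == value
    raise ValueError(f"unsupported operator: {operator}")
```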
🐛 Bug Fixes
- Preserve chat history on incomplete stream consumption in chat_engine.
- Guard against ZeroDivisionError in LlamaDebugHandler._get_time_stats_from_event_pairs.
- Add stacklevel=2 to warnings.warn() for accurate caller reporting.
- Use apostprocess_nodes() in async retrieval paths.
- Fix racy stream_chat memory assertion by using >= 1 in tests.
- Preserve response metadata in async _aretrieve_from_object in core.
- Preserve non-ASCII schema descriptions in PydanticOutputParser.
- Make structured_predict() return default values for single-field models in core.
- Fix MIME type guessing in the openai integration.
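The stream-consumption fix can be sketched with a plain generator: even if the caller abandons the stream early, the tokens produced so far are flushed into history via `try/finally`. Names here are illustrative, not the actual chat_engine internals:

```python
def stream_chat(tokens, history):
    """Yield streamed tokens; on any exit (including an abandoned stream),
    flush whatever was produced so far into the chat history."""
    buffer = []
    try:
        for tok in tokens:
            buffer.append(tok)
            yield tok
    finally:
        # Runs on normal completion AND on early close (GeneratorExit),
        # so partial responses are preserved in history.
        history.append("".join(buffer))
```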
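The LlamaDebugHandler fix guards a division over event-pair durations. A minimal sketch of the pattern, with a hypothetical helper rather than the actual implementation:

```python
def get_time_stats(durations: list) -> dict:
    """Compute total/average duration over recorded event pairs."""
    # Guard: with no recorded event pairs, total / len(durations)
    # previously raised ZeroDivisionError.
    if not durations:
        return {"total": 0.0, "average": 0.0, "count": 0}
    total = sum(durations)
    return {"total": total, "average": total / len(durations), "count": len(durations)}
```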
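The `stacklevel=2` change makes warnings point at the user's call site rather than at library internals. A self-contained demonstration:

```python
import warnings

def deprecated_helper():
    # stacklevel=2 attributes the warning to whoever called
    # deprecated_helper, not to this line inside the library.
    warnings.warn(
        "deprecated_helper is deprecated",
        DeprecationWarning,
        stacklevel=2,
    )

def caller():
    deprecated_helper()  # the warning is reported at this line
```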
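The non-ASCII fix concerns JSON escaping of schema text; the underlying behavior is the standard library's `ensure_ascii` flag:

```python
import json

schema = {"description": "Température de l'eau (水温)"}

# The default ensure_ascii=True escapes non-ASCII to \uXXXX sequences,
# which garbles human-readable field descriptions in prompts.
escaped = json.dumps(schema)

# ensure_ascii=False preserves the characters verbatim.
preserved = json.dumps(schema, ensure_ascii=False)
```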
🔧 Affected Symbols
⚡ Deprecations
- Python 3.9 support is deprecated across multiple packages (e.g., llama-index-agent-agentmesh and llama-index-core).
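A hypothetical sketch of the runtime check packages sometimes add during such a transition; this is not taken from the llama-index source:

```python
import sys
import warnings

def warn_if_unsupported(version_info=sys.version_info) -> bool:
    """Emit a DeprecationWarning on Python < 3.10; return whether it fired."""
    if tuple(version_info)[:2] < (3, 10):
        warnings.warn(
            "Python 3.9 support is deprecated and will be removed in a "
            "future release; please upgrade to Python 3.10+.",
            DeprecationWarning,
            stacklevel=2,
        )
        return True
    return False
```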