Migrating to LlamaIndex v0.14.13
Version 0.14.13 introduces one breaking change. This guide details how to update your code.
Released: 1/21/2026
1 Breaking Change · 2 Migration Steps · 17 Affected Symbols
⚠️ Check Your Code
If you use any of these symbols, you need to read this guide:
- ChatMemoryBuffer
- Memory
- CodeSplitter
- mean_agg
- ReActChatFormatter
- Bedrock embedding models
- Ollama
- VoyageAI models
- Ray IngestionPipeline
- Anthropic structured predict methods
- ChatMessage (in bedrock-converse)
- OpenAI structured output JSON schema
- YouRetriever
- Milvus add/delete operations
- MongoDB async integration
- Neo4j vector store metadata handling
- OpenSearch vector client

Breaking Changes
● Issue #1
The `ChatMemoryBuffer` class has been replaced by the generic `Memory` class in `llama-index-core`. Users relying on `ChatMemoryBuffer` must update their code to use `Memory` instead.
Migration Steps
1. Replace usage of `ChatMemoryBuffer` with the generic `Memory` class from `llama-index-core`.
2. When using `llama-index-vector-stores-milvus`, update add/delete calls to pass the partition via the `milvus_partition_name` keyword instead of the previous partition parameter.
Release Summary
This release introduces distributed data ingestion via Ray, token-based code splitting, and new integrations such as Apertis LLM and the Volcengine MySQL vector store. Notably, `ChatMemoryBuffer` has been replaced by the generic `Memory` class.
Need More Details?
View the full release notes and all changes for LlamaIndex v0.14.13.