Changelog

v0.12.48

📦 llamaindex
⚠️ 1 breaking change · ✨ 4 features · 🐛 6 fixes · 🔧 9 affected symbols

Summary

This release introduces a breaking change to the Context API, adds Cached Content support to the Google GenAI integration, and includes several bug fixes across core, OCI, and LlamaCloud integrations.

⚠️ Breaking Changes

  • The Context API has changed: ctx.get and ctx.set have been replaced by ctx.store.get and ctx.store.set. Code that relies on the old methods will fail.

Migration Steps

  1. Update any calls to 'ctx.get()' and 'ctx.set()' on Context objects to 'ctx.store.get()' and 'ctx.store.set()', respectively.
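A minimal before/after sketch of the migration inside a workflow step; the workflow, step name, and state key are illustrative, and it assumes ctx.store.get mirrors the old default= keyword:

```python
from llama_index.core.workflow import Context, StartEvent, StopEvent, Workflow, step

class CounterWorkflow(Workflow):
    @step
    async def count(self, ctx: Context, ev: StartEvent) -> StopEvent:
        # Old API (fails as of this release):
        #   visits = await ctx.get("visits", default=0)
        #   await ctx.set("visits", visits + 1)
        # New API: workflow state lives on ctx.store
        visits = await ctx.store.get("visits", default=0)
        await ctx.store.set("visits", visits + 1)
        return StopEvent(result=visits + 1)
```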

✨ New Features

  • Added Cached Content support to the GoogleGenAI LLM integration (see the sketch after this list).
  • Added support for image prompts in OCI Generative AI Llama models.
  • Optimized document hash checks to reduce KV store round trips.
  • Improved memory efficiency by preventing redundant document copies in metadata.
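A hedged sketch of how the Cached Content feature might be used. The `cached_content` constructor argument and the cache resource name are assumptions based on the feature description and Google's context-caching API, not confirmed by this changelog:

```python
from llama_index.llms.google_genai import GoogleGenAI

# Assumes a context cache was created beforehand with the google-genai SDK
# (e.g. client.caches.create(...)) and that the integration accepts its resource name.
llm = GoogleGenAI(
    model="gemini-2.0-flash-001",
    cached_content="cachedContents/1234567890",  # hypothetical cache resource name
)
print(llm.complete("Summarize the cached document set."))
```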

🐛 Bug Fixes

  • Fixed AgentWorkflowStartEvent to automatically convert dictionary-based chat_history entries into ChatMessage objects (see the sketch after this list).
  • Ensured CallbackManager is correctly applied to the default embed_model.
  • Fixed async retrieval of page figure nodes in LlamaCloud managed indices.
  • Replaced xml library with defusedxml in readers-file for improved security.
  • Fixed IntelEmbedding base implementation.
  • Fixed broken LanceDB managed index tests.
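A small sketch of what the chat_history fix enables; the import path for AgentWorkflowStartEvent is an assumption:

```python
from llama_index.core.agent.workflow import AgentWorkflowStartEvent  # path assumed

# Plain dicts in chat_history are now coerced into ChatMessage objects.
event = AgentWorkflowStartEvent(
    user_msg="And what is my name?",
    chat_history=[{"role": "user", "content": "My name is Ada."}],
)
print(type(event.chat_history[0]).__name__)  # expected: ChatMessage
```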

🔧 Affected Symbols

AgentWorkflowStartEvent, Context.get, Context.set, Context.store.get, Context.store.set, GoogleGenAI, IntelEmbedding, CallbackManager, LlamaCloudIndex