v0.14.4
📦 llamaindex
✨ 7 features · 🐛 10 fixes · 🔧 12 symbols
Summary
This release introduces support for Claude 4.5 and structured outputs for OpenAI-like models, while primarily addressing dependency issues and bug fixes across the ecosystem.
Migration Steps
- Update llama-index-core to 0.14.4 to resolve installation issues.
- Update specific provider packages (e.g., anyscale, azure-openai, fireworks) to resolve OpenAI dependency conflicts.
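The steps above amount to pinning core to the fixed release and refreshing whichever provider packages you use. A sketch of the commands (the three provider packages shown are examples from the list above, not the full affected set):

```shell
# Upgrade core to the release that resolves the installation issues.
pip install -U "llama-index-core==0.14.4"

# Refresh provider packages whose OpenAI dependency pins conflicted
# (examples only; upgrade the providers your project actually depends on).
pip install -U llama-index-llms-anyscale llama-index-llms-azure-openai llama-index-llms-fireworks
```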
✨ New Features
- Added support for Anthropic Claude Sonnet 4.5 in Anthropic and Bedrock Converse LLMs.
- Added structured outputs support for OpenAILike and OpenAI LLMs.
- Introduced Bedrock AgentCore Memory integration.
- Introduced ApacheSolrVectorStore integration.
- Updated the list of supported MistralAI models.
- Updated ScrapegraphAI tool integration.
- Added py.typed to OpenRouter and Anthropic packages for better type checking support.
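Structured outputs constrain a model to return JSON conforming to a user-supplied schema, which the client then validates into a typed object (llama-index exposes this through methods such as `as_structured_llm`). A minimal, stdlib-only sketch of the client-side half of that contract — the `Invoice` schema and the raw response string are invented for illustration:

```python
import json
from dataclasses import dataclass

@dataclass
class Invoice:
    vendor: str
    total: float

# With structured outputs enabled, the model is constrained to emit JSON
# matching the requested schema; the client parses and validates it into
# a typed object instead of scraping free-form text.
raw = '{"vendor": "Acme", "total": 12.5}'  # stands in for the model's response
data = json.loads(raw)
invoice = Invoice(vendor=str(data["vendor"]), total=float(data["total"]))
print(invoice.total)  # 12.5
```

The benefit over plain chat completions is that a schema violation surfaces as a parse/validation error at the boundary rather than as silently malformed downstream data.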
🐛 Bug Fixes
- Fixed OpenAI dependency issues across multiple embedding, LLM, tool, and vector store packages.
- Fixed pre-release installation issues in llama-index-core.
- Fixed authorization header setup logic in text-embeddings-inference.
- Fixed ValueError in Google GenAI when ChatMessage contains multiple blocks.
- Fixed unhashable type error in Ollama stream chat when using tools.
- Fixed Sarvam integration bugs and typos.
- Fixed ConfluenceReader to respect the cloud parameter when fetching child pages.
- Fixed ServiceNow reader issue where pages with empty/null latest content could not be fetched.
- Fixed index creation logic in Postgres vector store.
- Fixed playwright tests and ChromaVectorStore docstrings.
🔧 Affected Symbols
Anthropic, BedrockConverse, OpenAILike, BedrockAgentCoreMemory, ApacheSolrVectorStore, Ollama.stream_chat, ConfluenceReader, ServiceNowReader, PostgresVectorStore, ChromaVectorStore.query, GoogleGenAI, SarvamLLM