1.9.0
instructor · ⚠ 2 breaking · ✨ 7 features · 🐛 6 fixes · 🔧 9 symbols
Summary
This release introduces Ollama and Writer provider support, improves Gemini and Anthropic integrations, and standardizes VertexAI async parameters. It also enhances error handling with a new exception hierarchy and resolves several dependency conflicts.
⚠️ Breaking Changes
- The `enable_prompt_caching` parameter has been removed from the Anthropic integration; remove this argument from any calls that still pass it.
- Async parameter naming in the VertexAI client has been standardized, which may require updating parameter names in existing VertexAI async calls.
Migration Steps
- Remove `enable_prompt_caching` from any Anthropic client calls.
- Update VertexAI async calls to match the new standardized parameter naming.
- Update dependency constraints if using google-genai or rich.
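For the first migration step, a minimal before/after sketch. This assumes the argument was previously passed to `instructor.from_anthropic`; adjust to wherever your code passes it. (The VertexAI renames are not shown here, since the new parameter names depend on your specific calls.)

```python
import anthropic
import instructor

# Before (<= 1.8.x), hypothetical call shape:
# client = instructor.from_anthropic(
#     anthropic.Anthropic(),
#     enable_prompt_caching=True,  # removed in 1.9.0
# )

# After (1.9.0): drop the argument entirely.
client = instructor.from_anthropic(anthropic.Anthropic())
```

No other changes to the Anthropic call sites should be needed for this particular removal.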
✨ New Features
- Improved error handling with a comprehensive exception hierarchy.
- Added Ollama provider support to auto_client.
- Implemented JSON mode for the Writer provider.
- Added optional Gemini support and filtering of thought parts in GenAI tool parsing.
- Enabled Audio module compatibility for Windows.
- Added `dev` and `docs` groups to `project.optional-dependencies` for uv compatibility.
- Updated README and documentation with SEO improvements for asyncio and tenacity.
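The exception-hierarchy feature above means callers can catch one base class instead of juggling provider-specific errors. A minimal sketch of the pattern, using hypothetical class names rather than instructor's actual ones (check the library's exceptions module for the real hierarchy):

```python
# Illustrative sketch of a library exception hierarchy --
# class names here are hypothetical, not instructor's actual API.

class InstructorError(Exception):
    """Base class: catch this to handle any library error."""

class IncompleteOutputException(InstructorError):
    """Model stopped before producing complete structured output."""

class ValidationError(InstructorError):
    """Output parsed, but failed schema validation."""

def extract(simulate_truncation: bool):
    # Stand-in for a structured-extraction call that can fail two ways.
    if simulate_truncation:
        raise IncompleteOutputException("hit max_tokens mid-object")
    raise ValidationError("age must be positive")

# A single except clause covers the whole hierarchy:
for flag in (True, False):
    try:
        extract(flag)
    except InstructorError as err:
        print(type(err).__name__, "-", err)
```

The payoff is that application code only needs `except InstructorError` at its boundary, while still being able to branch on the concrete subclass when it matters.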
🐛 Bug Fixes
- Fixed Gemini configuration issues.
- Resolved a pyright TypedDict key-access error in `dump_message`.
- Fixed the retry mechanism to respect timeout parameters for Ollama compatibility.
- Handled `ThinkingBlock` in `reask_anthropic_json` to prevent parsing errors.
- Fixed documentation for dynamic model creation examples.
- Filtered out Gemini thought parts in GenAI tool parsing.
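The retry/timeout fix above is the classic deadline problem: a retry loop that ignores the caller's overall timeout can keep launching attempts long after the deadline has passed. A self-contained sketch of the corrected behavior (illustrative only, not instructor's actual implementation):

```python
import time

def retry_with_deadline(fn, *, attempts=3, timeout=2.0):
    """Retry fn, but never start a new attempt after `timeout`
    seconds have elapsed overall."""
    deadline = time.monotonic() + timeout
    last_exc = None
    for _ in range(attempts):
        if time.monotonic() >= deadline:
            break  # honor the caller's timeout instead of retrying blindly
        try:
            return fn()
        except Exception as exc:
            last_exc = exc
    raise TimeoutError("retries exhausted or deadline passed") from last_exc

calls = []
def flaky():
    calls.append(1)
    raise ConnectionError("model still loading")

try:
    # With a zero timeout, no attempt should even start.
    retry_with_deadline(flaky, attempts=5, timeout=0.0)
except TimeoutError:
    print("gave up after", len(calls), "attempts")
```

Checking the deadline before each attempt (rather than only counting attempts) is what makes a slow local backend like Ollama fail fast instead of stacking retries past the configured timeout.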
🔧 Affected Symbols
`Anthropic`, `VertexAI`, `Ollama`, `Writer`, `reask_anthropic_json`, `dump_message`, `ThinkingBlock`, `Audio`, `genai`