v1.78.6-nightly
📦 litellm
✨ 3 features · 🐛 11 fixes · 🔧 8 symbols
Summary
This release focuses heavily on bug fixes across providers, including Anthropic, the OpenAI Realtime API, and Gemini, and introduces new Azure AVA TTS and cost tracking features.
✨ New Features
- Added Azure AVA TTS integration.
- Added Azure AVA (Speech AI) Cost Tracking.
- Implemented the ability to read the LLM provider from the custom-llm-provider header (see the sketch after this list).
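The header-based provider override can be exercised from any OpenAI-compatible client pointed at the litellm proxy. The snippet below is a minimal sketch, assuming a proxy running at http://localhost:4000, a placeholder key and model name, and that the header is named exactly `custom-llm-provider`; none of these values are confirmed defaults.

```python
from openai import OpenAI

# Assumed local litellm proxy address and placeholder key.
client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Hello"}],
    # Per this release, the proxy can read the provider from this header
    # instead of relying on a provider prefix in the model name.
    extra_headers={"custom-llm-provider": "openai"},
)
print(response.choices[0].message.content)
```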
🐛 Bug Fixes
- Fixed Anthropic cache_control being applied to all content items instead of only the last item (see the sketch after this list).
- Forwarded anthropic-beta headers to Bedrock and VertexAI.
- Added pre-call and post-call hooks for the list batches endpoint.
- Added the function responsible for invoking the pre-call hook.
- Added gpt-4.1 pricing for the Responses API endpoint.
- Fixed an incorrect request body in the JSON mode documentation.
- Fixed JSON serialization error in Helicone logging by removing OpenTelemetry span from metadata.
- Resolved OpenAI Realtime API integration failure due to websockets.exceptions.PayloadTooBig error.
- Changed the max_tokens value to match max_output_tokens for Claude Sonnet.
- Fixed an incorrect status value in the Responses API when using Gemini.
- Fixed OpenTelemetry Logging functionality.
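To make the cache_control fix concrete, the helper below is an illustrative sketch (not litellm's internal code) of the corrected behavior: only the final content block of an Anthropic-style message keeps the ephemeral cache_control marker.

```python
def apply_cache_control_to_last_block(message: dict) -> dict:
    """Keep cache_control only on the last content block of a message."""
    content = message.get("content")
    if not isinstance(content, list) or not content:
        return message
    # Strip the marker from every block except the last one.
    for block in content[:-1]:
        if isinstance(block, dict):
            block.pop("cache_control", None)
    last = content[-1]
    if isinstance(last, dict):
        last["cache_control"] = {"type": "ephemeral"}
    return message


message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "Here is a long reference document..."},
        {"type": "text", "text": "Now answer a question about it."},
    ],
}
apply_cache_control_to_last_block(message)
# Only the second (last) block now carries {"type": "ephemeral"}.
```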
🔧 Affected Symbols
Anthropic, Bedrock, VertexAI, OpenAI Realtime API, Claude Sonnet, Gemini, Helicone logging, OpenTelemetry Logging