v1.77.7.dev-gemma
📦 litellm
✨ 2 features · 🐛 9 fixes
Summary
This release focuses on stability and feature enhancements, including fixes for costing, rate limiting, and Anthropic tool calls, alongside the addition of EnkryptAI Guardrails.
✨ New Features
- Added EnkryptAI Guardrails integration to LiteLLM.
- Added pricing details for new Azure AI models.
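LiteLLM guardrails are typically enabled through the proxy `config.yaml`. A minimal sketch of what wiring up the new integration might look like is below; the `enkryptai` provider key, guardrail name, `mode` value, and environment-variable name are assumptions following LiteLLM's common guardrail config pattern, not details confirmed by these notes:

```yaml
# Hypothetical proxy config sketch — field values are assumed,
# following LiteLLM's generic guardrail configuration shape.
guardrails:
  - guardrail_name: "enkryptai-pre-guard"    # assumed name
    litellm_params:
      guardrail: enkryptai                   # assumed provider key
      mode: "pre_call"                       # run before the LLM call
      api_key: os.environ/ENKRYPTAI_API_KEY  # assumed env var
```

Check the LiteLLM guardrails documentation for the exact parameters this integration accepts.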
🐛 Bug Fixes
- Fixed litellm_param-based costing.
- Fixed parallel tool calls in the Anthropic passthrough adapter.
- Redacted AWS credentials when redact_user_api_key_info is enabled.
- Fixed the dynamic rate limiter v3 by inserting litellm_model_saturation.
- Fixed sessions not being shared.
- Prevented empty values in the DB from accidentally overriding config file values.
- Temporarily relaxed ResponsesAPIResponse parsing to support custom backends (e.g., vLLM).
- Fixed OpenRouter cache_control to only apply to the last content block.
- Removed a panic from a hot path.
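To illustrate the OpenRouter `cache_control` fix above: Anthropic-style prompt caching marks a prefix boundary, so the marker belongs only on the last content block rather than on every block. A minimal sketch of that behavior, using a hypothetical helper (not LiteLLM's actual internal function):

```python
def apply_cache_control(content_blocks):
    """Hypothetical illustration of the fix: attach cache_control only
    to the final content block, stripping it from all earlier blocks."""
    blocks = [dict(b) for b in content_blocks]  # shallow-copy each block
    for block in blocks[:-1]:
        block.pop("cache_control", None)        # earlier blocks: no marker
    if blocks:
        blocks[-1]["cache_control"] = {"type": "ephemeral"}  # last block only
    return blocks
```

The `{"type": "ephemeral"}` value follows Anthropic's prompt-caching convention; the helper name and signature are illustrative only.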