Changelog

v1.81.12-stable

📦 litellm
🐛 3 fixes

Summary

LiteLLM v1.81.12-stable is a stability release, fixing three bugs related to parameter passing, streaming responses, and context length validation across various providers.

🐛 Bug Fixes

  • Fixed an issue where the `max_tokens` parameter was not being correctly passed to the Azure OpenAI provider.
  • Resolved a bug causing incorrect handling of streaming responses for certain providers.
  • Addressed an issue where context length validation was sometimes bypassed for specific models.