v1.81.6.rc.4

📦 litellm
🐛 3 fixes

Summary

LiteLLM v1.81.6.rc.4 is a stability release, fixing several bugs related to parameter handling (such as `max_tokens` for Azure OpenAI and `timeout` on API calls) and streaming responses across providers.

🐛 Bug Fixes

  • Fixed an issue where the `max_tokens` parameter was not being correctly passed to the Azure OpenAI provider.
  • Resolved a bug causing incorrect handling of streaming responses for certain providers.
  • Fixed an issue where the `timeout` parameter was ignored for some API calls.
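For reference, the two affected parameters are passed as ordinary keyword arguments. The sketch below shows where they sit in a request; the deployment name is hypothetical, and the actual `litellm.completion` call is commented out since it needs Azure credentials:

```python
# Request parameters touched by this release's fixes.
request_kwargs = {
    "model": "azure/my-deployment",  # hypothetical Azure deployment name
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 64,  # previously not forwarded correctly to Azure OpenAI
    "timeout": 30,     # previously ignored on some API calls
}

# With credentials configured, this would be:
# import litellm
# response = litellm.completion(**request_kwargs)
```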