Change8

v1.79.dev.1

📦 litellm · View on GitHub →
✨ 9 features · 🐛 13 fixes · 🔧 11 symbols

Summary

This release adds several UI enhancements, new provider and routing features, and numerous bug fixes across management endpoints, streaming, and budget handling.

✨ New Features

  • UI - Model Info Page Health Check
  • Add softgen to the list of projects using litellm
  • Add Kimi K2 thinking support
  • UI - Test Key Page shows models based on selected endpoint
  • Add a `GET /providers` endpoint that lists supported providers (see the sketch after this list)
  • UI - Invite User searchable team select
  • Add SDK‑focused examples for custom prompt management
  • Router now supports default fallbacks for unknown models (see the sketch after this list)
  • AI Gateway – End User Budgets: `max_end_user_budget` can now point to a budget ID, applying that default budget to all end users

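The router fallback feature can be exercised from the SDK. Below is a minimal sketch, assuming the new behavior reuses the existing `default_fallbacks` Router setting so that requests for model names not present in `model_list` are retried against the listed fallback deployment instead of erroring:

```python
from litellm import Router

router = Router(
    model_list=[
        {
            # The only deployment the router knows about.
            "model_name": "gpt-4o",
            "litellm_params": {"model": "openai/gpt-4o"},
        }
    ],
    # Assumption: requests for unknown model names now fall back to these deployments.
    default_fallbacks=["gpt-4o"],
)

# "some-unknown-model" is not in model_list; with default fallbacks enabled
# the call should be served by "gpt-4o" rather than raising a not-found error.
response = router.completion(
    model="some-unknown-model",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

The new providers listing can be queried against a running proxy. A hedged sketch, assuming the route is `GET /providers` (as listed under affected symbols) and accepts the usual Bearer-key auth; the URL and key below are placeholders:

```python
import requests

# Assumed local proxy URL and virtual key; adjust for your deployment.
resp = requests.get(
    "http://localhost:4000/providers",
    headers={"Authorization": "Bearer sk-1234"},
)
resp.raise_for_status()
print(resp.json())  # expected: the list of supported providers
```
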
🐛 Bug Fixes

  • Fix container API link in release page
  • Docs: fix streaming example in README
  • Management Endpoints – fixed inconsistent error responses; requests for a non-existent user now return a proper 404 with a consistent error schema
  • Allow internal users to access video generation routes
  • LiteLLM Usage now correctly displays key_hash
  • Removed strict master_key check in add_deployment
  • Proxy: corrected date range filtering in the /spend/logs endpoint (see the sketch after this list)
  • Updated model_cost_map_url to use environment variable
  • Langfuse: handle null usage values to prevent validation errors
  • Apply provided timeout value to ClientTimeout.total
  • Fixed bug where updated spend was not sent to CloudZero
  • Fixed inability to delete an MCP server from permission settings
  • Bedrock Knowledge bases – ensure users can access `search_results` for both streaming and non‑streaming responses to /chat/completions

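The `/spend/logs` date-range fix can be verified against a running proxy. A minimal sketch, assuming the endpoint accepts `start_date` and `end_date` query parameters in `YYYY-MM-DD` form and the usual Bearer-key auth; the URL and key below are placeholders:

```python
import requests

# Assumed local proxy URL and virtual key; adjust for your deployment.
resp = requests.get(
    "http://localhost:4000/spend/logs",
    headers={"Authorization": "Bearer sk-1234"},
    params={"start_date": "2024-11-01", "end_date": "2024-11-07"},
)
resp.raise_for_status()

# Only logs whose timestamps fall inside the requested range should come back.
for log in resp.json():
    print(log.get("request_id"), log.get("spend"))
```
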
🔧 Affected Symbols

  • add_deployment
  • ClientTimeout.total
  • model_cost_map_url
  • router
  • max_end_user_budget
  • video_generation_routes
  • LiteLLM_Usage
  • BedrockKnowledgeBase.search_results
  • GET /providers endpoint
  • GET /spend/logs endpoint
  • customer_management_endpoints