Changelog

v1.80.7.dev.3

📦 litellm · View on GitHub →
✨ 23 features · 🐛 30 fixes · ⚡ 1 deprecation · 🔧 18 symbols

Summary

This release focuses on UI improvements, infrastructure updates, and expanded provider support, along with new features for guardrails and cost tracking. A key change is the deprecation of the legacy `/spend/logs` endpoint in favor of `/spend/logs/v2`.

Migration Steps

  1. If you were using the `/spend/logs` endpoint, update your calls to use `/spend/logs/v2` instead (see the sketch below).
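
A minimal sketch of the migration, assuming a LiteLLM proxy reachable at `http://localhost:4000` and a key with spend-read access (both placeholders); check your deployment's docs for the exact query parameters supported by `/spend/logs/v2`:

```python
import requests

# Placeholder proxy URL and key; substitute your own deployment values.
BASE_URL = "http://localhost:4000"
HEADERS = {"Authorization": "Bearer sk-1234"}

# Before (deprecated):
# resp = requests.get(f"{BASE_URL}/spend/logs", headers=HEADERS)

# After: the same call against the v2 endpoint.
resp = requests.get(f"{BASE_URL}/spend/logs/v2", headers=HEADERS)
resp.raise_for_status()
print(resp.json())
```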

✨ New Features

  • Add guardrails for pass-through endpoints
  • UI - allow adding pass-through guardrails from the UI
  • Add provider publicai.co
  • Add regex-based tool_name/tool_type matching for tool-permission
  • Update new Anthropic features
  • Add Nova embedding support
  • Add support for TwelveLabs Pegasus
  • Add pass-through cost tracking for Veo
  • Improve image generation handling for Gemini models
  • WatsonX - allow passing `zen_api_key` dynamically (see the sketch after this list)
  • JWT Auth - AI Gateway: allow using the regular OIDC flow with user-info endpoints
  • Add new model `fireworks_ai/kimi-k2-instruct-0905` (see the sketch after this list)
  • Generic Guardrail API - lets guardrail providers add instant LiteLLM support without a PR to the repo
  • Add `claude-opus-4-5` alias to pricing data
  • Add audio transcription support for OVHcloud
  • Add context window exception mapping for Together AI
  • Add Embedding API support for github-copilot
  • Guardrail API V2 - user API key metadata, session ID, input type selection (request/response), and image support
  • Add `vllm` batch+files API support
  • Add additional routes to JWT auth
  • Add support for OpenAI models on Bedrock
  • Add experimental latest-user filtering for Bedrock
  • Add Z.AI (Zhipu AI) as a built-in provider
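
Two of the items above lend themselves to a short sketch: calling the new `fireworks_ai/kimi-k2-instruct-0905` model and passing a WatsonX `zen_api_key` per request. This assumes `zen_api_key` is accepted as a per-request kwarg in the usual LiteLLM style for dynamic credentials; the WatsonX model ID and key below are purely illustrative placeholders:

```python
import litellm

messages = [{"role": "user", "content": "Say hello in one sentence."}]

# New Fireworks AI model alias added in this release.
resp = litellm.completion(
    model="fireworks_ai/kimi-k2-instruct-0905",
    messages=messages,
)
print(resp.choices[0].message.content)

# WatsonX: pass zen_api_key dynamically instead of via environment variables.
# Model ID and key are placeholders, not values named in this release.
resp = litellm.completion(
    model="watsonx/ibm/granite-3-8b-instruct",
    messages=messages,
    zen_api_key="your-zen-api-key",
)
print(resp.choices[0].message.content)
```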

🐛 Bug Fixes

  • Do not include plaintext message in exception
  • Change Add Fallback Modal to use Antd Select
  • Fix Vector Store list endpoint returning 404
  • Fix JSON viewer in the Request and Response panel
  • UI - fix fallbacks being deleted immediately, before the API call resolves
  • Remove Feature Flags
  • Migrate Anthropic provider to Azure AI
  • Fix streaming error validation
  • Handle Cohere v4 embed response dictionary format for Bedrock
  • Fix `acompletion` throwing an error with SambaNova models
  • Allow wildcard routes for non-proxy admin (SCIM)
  • Fix metadata tags and model name display in the UI for Azure passthrough; add cost tracking for the Responses API
  • Respect custom LLM provider in header
  • Remove incompatible beta header from Bedrock
  • Fix: respect guardrail `mock_response` during `during_call` to return blo...
  • Fix session consistency, move Lasso API version away from source code
  • Fix Watsonx Audio Transcription API
  • Fix `extra_headers` in the Messages API for Bedrock Invoke
  • Fix GA path for Azure OpenAI realtime models
  • Fix `litellm_enterprise`: ensure imported routes exist
  • Fix new org team validation against the org
  • Fix SSO users not being added to Entra-synced teams
  • Fix AttributeError when metadata is null in request body
  • SSO: clear SSO integration for all users
  • Remove URL format validation for MCP server endpoints
  • Update default database connection count
  • Update default `proxy_batch_write_at` value
  • Fix 500 error for malformed requests
  • Fix LiteLLM user auth not passing
  • Fix Bedrock Guardrail Indent and Impo

🔧 Affected Symbols

websockets, spend/logs, spend/logs/v2, tool-permission, bedrock, cohere v4 embed response, SambaNova models, Azure passthrough, veo, gemini models, WatsonX, fireworks_ai/kimi-k2-instruct-0905, Guardrail API V2, vllm batch+files API, jwt auth, bedrock OpenAI model, Z.AI (Zhipu AI), OVHcloud audio transcription

⚡ Deprecations

  • [Refactor] Deprecate `/spend/logs` and add `/spend/logs/v2`