Changelog

v1.81.14-nightly

📦 litellm
✨ 16 features · 🐛 27 fixes · 🔧 42 symbols

Summary

This release adds day-0 support for Gemini 3.1 Pro Preview, a set of AI policy template features (suggestions, a test playground, and new templates), and extensive bug fixes across the proxy, CI/CD workflows, and model parameter handling. Several tests that require external services are now skipped to improve CI stability.

Migration Steps

  1. Uncomment `response_model` in the `user_info` endpoint.

✨ New Features

  • [Feat] Add Default usage data configuration
  • [Feat] Add server side compaction translation from openai to anthropic
  • Add method based routing for passthrough endpoints
  • [Feat] Add gemini 3.1 pro preview day 0 support
  • [Feature] Key Last Active Tracking
  • feat: AI policy template suggestions
  • Add OpenAPI-to-MCP support via API and UI
  • feat: add airline off-topic restriction policy template
  • feat(policy): test playground for AI policy suggestions
  • feat: prompt injection guardrail policy template
  • feat(ui): show latency overhead for AI-suggested policy templates
  • feat(ci): auto-approve and auto-merge the regenerated poetry.lock PR
  • [Fix] Service Account Visibility for Team Members
  • support reasoning and effort parameters on sonnet 4.6
  • [Feat] Add reasoning support via config
  • [Fix] Add MCP via OpenAPI spec
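
To try the new Gemini model through the proxy, a config entry along these lines should work; the exact model identifier is an assumption based on the changelog entry, not confirmed by it:

```yaml
model_list:
  - model_name: gemini-3.1-pro-preview
    litellm_params:
      model: gemini/gemini-3.1-pro-preview   # assumed provider-prefixed id
      api_key: os.environ/GEMINI_API_KEY
```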

🐛 Bug Fixes

  • fix: remove list-to-str transformation from dashscope
  • fix: allow github aliases to reuse upstream model metadata
  • fix(proxy): prevent is_premium() debug log spam on every request
  • Convert thinking_blocks to content blocks for hosted_vllm multi-turn
  • Fix usage in xai
  • Fix: add stop param as supported for openai and azure
  • fix(websearch_interception): fix pre_call_deployment_hook not triggering via proxy router
  • fix(constants): add env var override support for COMPETITOR_LLM_TEMPERATURE and MAX_COMPETITOR_NAMES
  • fix(types): fix mypy errors in pass-through endpoint query param types
  • fix: handle deprovisioning operations without path field
  • fix(bedrock): add Accept header for AgentCore MCP server requests
  • fix: reduce proxy overhead for large base64 payloads
  • fix(key management): return failed_tokens in delete_verification_tokens response
  • fix: handle explicit None model_info in LowestLatencyLoggingHandler
  • fix(lint): remove unused imports in semantic_guard and policy_endpoints
  • fix(tests): set premium_user=True in JWT tests that call user_api_key_auth
  • fix(anthropic): empty system messages in translate_system_message
  • fix(tests): pass host to RedisCache in test_team_update_redis
  • fix(policy): use litellm.acompletion directly in AiPolicySuggester
  • fix(lint): remove redundant router import in policy_endpoints __init__
  • Fix _map_reasoning_effort_to_thinking_level for all gemini 3 family
  • Fix "api_base is required. Unable to determine the correct api_base for the request" error
  • Fix mapping of parallel_tool_calls for bedrock converse
  • fix(tests): update test_max_effort_rejected_for_opus_45 regex to match current error message
  • fix(policy_endpoints): re-export private helper functions from package __init__.py
  • fix(tests): set premium_user=True in test_aasync_call_with_key_over_model_budget
  • fix(tests): correct medium reasoning_effort assertion for gemini-3-pro-preview
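
Several fixes above follow the same pattern, e.g. the new env-var overrides for COMPETITOR_LLM_TEMPERATURE and MAX_COMPETITOR_NAMES: a constant keeps its default unless the environment supplies a value. A stdlib-only sketch of that pattern (the names mirror the changelog entry; the actual litellm constants module may differ):

```python
import os

# Assumed default; the real value lives in litellm's constants module.
DEFAULT_COMPETITOR_LLM_TEMPERATURE = 0.7

def competitor_llm_temperature() -> float:
    """Return COMPETITOR_LLM_TEMPERATURE from the environment, else the default."""
    raw = os.getenv("COMPETITOR_LLM_TEMPERATURE")
    return float(raw) if raw is not None else DEFAULT_COMPETITOR_LLM_TEMPERATURE
```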

Affected Symbols