v1.82.0-nightly
📦 litellm
✨ 18 features · 🐛 29 fixes · ⚡ 1 deprecation · 🔧 22 symbols
Summary
This release introduces significant UI features related to Projects and Access Groups, adds several new models and cost map entries, and includes numerous performance optimizations and bug fixes across proxy, caching, and model handling.
Migration Steps
- If using older xAI models grok-2-vision-1212 or grok-3-mini, note their deprecation dates and plan migration.
- If encountering 400 errors across proxy workers, ensure stale mcp-session-id headers are stripped (this is now fixed).
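The header-stripping fix above follows a common proxy pattern: drop any session identifier minted by one worker before the request reaches another. A minimal sketch of that idea (illustrative only, not litellm's actual implementation; the function name is hypothetical):

```python
# Sketch: remove a stale "mcp-session-id" header before re-forwarding a
# request, so a session id issued by one proxy worker is never replayed
# against a different worker (which would reject it with a 400).
def strip_stale_session_headers(headers: dict) -> dict:
    """Return a copy of the headers without the mcp-session-id header.

    Header names are matched case-insensitively, as HTTP requires.
    """
    return {k: v for k, v in headers.items() if k.lower() != "mcp-session-id"}
```

For example, `strip_stale_session_headers({"Authorization": "Bearer x", "MCP-Session-Id": "abc"})` keeps only the Authorization header.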
✨ New Features
- Added support for gemini-3.1-flash-image-preview model on Vertex AI.
- Added v1 transformation for Anthropic responses.
- Enabled local file support for OCR.
- Implemented Access group CRUD with bidirectional team/key synchronization.
- Added gpt-audio-1.5 and gpt-realtime-1.5 to the model cost map.
- Added OpenRouter native models to the model cost map.
- Included timestamps in the /project/list response.
- Introduced UI for Projects with list and create flows.
- Added an in_flight_requests metric, exposed via the /health/backlog endpoint and Prometheus.
- Implemented an agent RBAC permission check to ensure internal users cannot create agents.
- Added project_id and access_group_id filters to the Key list endpoint.
- Introduced UI for Project Details Page.
- Added project keys table and project dropdown to key create/edit UI.
- Added ability to trace metrics to Datadog.
- Added duplicate issue detection and auto-close bot to CI.
- Introduced LiteLLM Presidio stream v3.
- Added delete project action to UI - Projects.
- Added UI - Admi...
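The new in_flight_requests metric is a standard Prometheus gauge pattern: increment on request start, decrement on completion, so a scrape shows the current backlog. A dependency-free sketch of the mechanic (all names here are illustrative, not litellm's actual code):

```python
# Minimal in-flight request gauge: goes up when a request starts and back
# down when it finishes, so sampling it at any moment shows the backlog.
import contextlib
import threading

class InFlightGauge:
    def __init__(self) -> None:
        self._value = 0
        self._lock = threading.Lock()  # guard against concurrent handlers

    @contextlib.contextmanager
    def track(self):
        with self._lock:
            self._value += 1
        try:
            yield
        finally:
            with self._lock:
                self._value -= 1

    @property
    def value(self) -> int:
        return self._value

IN_FLIGHT = InFlightGauge()

def handle_request() -> int:
    with IN_FLIGHT.track():
        # ...actual request handling would go here...
        return IN_FLIGHT.value  # inside the block, this request is counted
```

With the real prometheus_client library, `Gauge.track_inprogress()` provides the same behavior out of the box.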
🐛 Bug Fixes
- Fixed converse handling for parallel_tool_calls.
- Preserved forwarding of server-side called tools.
- Fixed free models not working correctly from the UI.
- Added ChatCompletionImageObject in OpenAIChatCompletionAssistantMessage.
- Fixed poetry lock issues.
- Stripped stale mcp-session-id to prevent 400 errors across proxy workers.
- Fixed function calling for PublicAI Apertus models.
- Fixed Claude code plugin schema.
- Added missing migration for LiteLLM_ClaudeCodePluginTable in the database.
- Restored parallel_tool_calls mapping in map_openai_params for Bedrock.
- Updated test mocks for renamed filter_server_ids_by_ip_with_info in MCP.
- Added PROXY_ADMIN role to system user for key rotation.
- Populated user_id and user_info for admin users in /user/info.
- Fixed LLMClientCache._remove_key to store task references.
- Propagated extra_headers to Upstream for image generation.
- Passed MCP auth headers from request into tool fetch for /v1/responses and chat completions.
- Shortened guardrail benchmark result filenames for Windows long path support.
- Fixed MCP to default available_on_public_internet to true.
- Fixed filtering of internal json_tool_call when mixed with real tools in Bedrock.
- Fixed OIDC discovery URLs, roles array handling, and dot-notation error hints in JWT handling.
- Fixed TTS metrics issues.
- Prevented the update_price_and_context_window workflow from running in forks.
- Removed duplicate env key in scan_duplicate_issues workflow.
- Suppressed PLR0915 in complex transform methods during linting.
- Implemented atomic RPM rate limiting in model rate limit check.
- Isolated get_config failures from the model sync loop in the proxy.
- Fixed CI to handle inline table in pyproject.toml for proxy-extras version check.
- Bumped litellm-proxy-extras to 0.4.50 and fixed 3 failing tests.
- Resolved flaky tests from leaked @tremor/react Tooltip timer in UI.
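The LLMClientCache._remove_key fix reflects a general asyncio pitfall: the event loop keeps only a weak reference to tasks created with asyncio.create_task, so a fire-and-forget cleanup task can be garbage-collected before it runs. The standard remedy is to hold a strong reference until the task completes. A hedged sketch of that pattern (class and method names are hypothetical, not litellm's code):

```python
import asyncio

class TaskTrackingCache:
    """Toy cache illustrating the strong-task-reference pattern.

    asyncio.create_task returns a Task that the event loop references only
    weakly; storing it in a set until completion prevents premature GC.
    """
    def __init__(self) -> None:
        self._data: dict = {}
        self._background_tasks: set = set()

    def remove_key_later(self, key: str) -> None:
        task = asyncio.create_task(self._remove(key))
        self._background_tasks.add(task)  # keep a strong reference
        # drop the reference once the task finishes
        task.add_done_callback(self._background_tasks.discard)

    async def _remove(self, key: str) -> None:
        await asyncio.sleep(0)  # yield to the loop, then delete
        self._data.pop(key, None)

async def main() -> dict:
    cache = TaskTrackingCache()
    cache._data["a"] = 1
    cache.remove_key_later("a")
    await asyncio.gather(*cache._background_tasks)
    return cache._data  # the key has been removed by the background task
```

Without the set and done-callback, nothing would stop the interpreter from collecting the pending task, which is exactly the class of bug the release addresses.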
Affected Symbols
gemini-3.1-flash-image-preview, gpt-audio-1.5, gpt-realtime-1.5, grok-2-vision-1212, grok-3-mini, OpenRouter Opus 4.6, Claude Opus 4.6, Universal-3 Pro, Speech Understanding, LLM Gateway, ChatCompletionImageObject, OpenAIChatCompletionAssistantMessage, LLMClientCache, filter_server_ids_by_ip_with_info, LiteLLM_ClaudeCodePluginTable, parallel_tool_calls, map_openai_params, extra_headers, json_tool_call, OIDC discovery URLs, in_flight_requests, pyproject.toml
⚡ Deprecations
- Added deprecation dates for xAI grok-2-vision-1212 and grok-3-mini models.