v1.82.3.dev.2
📦 litellm
✨ 4 features · 🐛 21 fixes · 🔧 20 symbols
Summary
This release focuses heavily on bug fixes across proxy functionality, UI components, and provider integrations, alongside infrastructure updates such as dependency regeneration and code formatting. New features include team-listing access control enhancements and proxy-wide rate limits.
Migration Steps
- Add IF NOT EXISTS clause when creating indexes in migrations.
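The point of the `IF NOT EXISTS` clause is idempotency: re-running a migration that creates an index should be a no-op rather than an error. A minimal sketch using Python's built-in `sqlite3` (table and index names here are illustrative, not taken from the actual migration):

```python
import sqlite3

# In-memory database purely to demonstrate idempotent index creation.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE team (id TEXT PRIMARY KEY, organization_id TEXT)")

ddl = "CREATE INDEX IF NOT EXISTS team_org_idx ON team (organization_id)"
conn.execute(ddl)  # first run creates the index
conn.execute(ddl)  # second run is a no-op instead of raising OperationalError
print("migration is idempotent")
```

Without `IF NOT EXISTS`, the second `CREATE INDEX` would raise an error and abort the migration.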
✨ New Features
- Added org admin access control, members_count, and indexes to the /v2/team/list endpoint.
- Added external link icon to Learning Resources in the UI Leftnav.
- Support ANTHROPIC_AUTH_TOKEN and ANTHROPIC_BASE_URL environment variables for Anthropic provider.
- Add proxy-wide default API key TPM/RPM limits.
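The new Anthropic environment variables can be exercised with a minimal sketch. The variable names come from the release notes; the token, URL, and model name below are placeholders, and the `litellm.completion` call is left commented out to avoid a live network request:

```python
import os

# The Anthropic provider now honors these variables when api_key/api_base
# are not passed explicitly (placeholder values, not real credentials):
os.environ["ANTHROPIC_AUTH_TOKEN"] = "sk-ant-placeholder"
os.environ["ANTHROPIC_BASE_URL"] = "https://anthropic-gateway.example.com"

# With the variables set, a call can omit api_key/api_base entirely:
# import litellm
# response = litellm.completion(
#     model="anthropic/claude-example-model",
#     messages=[{"role": "user", "content": "Hello"}],
# )
print(os.environ["ANTHROPIC_BASE_URL"])
```

This mirrors the pattern already used for other providers: credentials and endpoint overrides flow in through the environment rather than call arguments.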
🐛 Bug Fixes
- Auto-recover shared aiohttp session when closed.
- Model-level guardrails now execute correctly for non-streaming post_call.
- Prevented duplicate callback logs for pass-through endpoint failures.
- Fixed Langfuse OTel traceparent propagation.
- Fixed a guardrail mode type crash on non-string values in UI logs.
- Added missing permission options to the UI's Default Team Settings.
- Fixed /key/block and /key/unblock endpoints to return 404 (instead of 401) for non-existent keys.
- Fixed the key update endpoint to return 404 instead of 401 for nonexistent keys.
- Surfaced Anthropic code execution results as code_interpreter_call in the Responses API.
- Fixed thinking blocks being dropped when the thinking field is null.
- Preserve router model_group in generic API logs.
- Fixed a proxy-only failure call type issue.
- Populated usage_metadata in outputs for the Cost column in Langsmith.
- Improved the performance of model repetition detection.
- Fixed logging for incomplete streaming responses with custom pricing on /v1/messages and /v1/responses.
- Fixed the missing 'id' field in streaming chunks for MiniMax when using the openai provider.
- Used AZURE_DEFAULT_API_VERSION as the default for the proxy --api_version flag.
- Deferred logging until post-call guardrails complete in proxy.
- Killed orphaned prisma engine subprocess on failed disconnect in proxy.
- Short-circuited websearch for github_copilot provider.
- Resolved recursion in OVHCloud get_supported_openai_params.
Affected Symbols
aiohttp session, /v2/team/list, Langfuse OTel traceparent, Guardrail Mode Type, /key/block, /key/unblock, Key Update Endpoint, Anthropic code execution results, Responses API, router model_group, Langsmith usage_metadata, model repetition detection, response incomplete streaming, /v1/messages, /v1/responses, MiniMax streaming chunks, proxy --api_version, prisma engine subprocess, websearch for github_copilot provider, OVHCloud get_supported_openai_params