v1.77.7.dev3
📦 litellm
✨ 14 features · 🐛 24 fixes · ⚡ 1 deprecation · 🔧 15 symbols
Summary
This release introduces several new features, including support for dynamic client registration, OpenTelemetry context propagation, and FAL AI Image Generations. Numerous bug fixes address issues across Azure OpenAI, Guardrails, logging, and UI components.
✨ New Features
- Upgraded the Lasso API integration to v3 (includes the ULID generation fix listed under Bug Fixes).
- Enabled OpenTelemetry context propagation from external tracers (sketch after this list).
- Added support for dynamic client registration.
- Added support for `during_call` mode for Model Armor guardrails.
- Added `base_url` configuration via environment variables for OpenRouter (sketch after this list).
- Added handling for the `v1` API version for Azure OpenAI.
- Respected the `LiteLLM-Disable-Message-Redaction` header in the Responses API (sketch after this list).
- Changed the API Base field from a select to a text input in the New LLM Credentials UI.
- Added `/openai` routes for the Responses API for Azure OpenAI SDK compatibility (sketch after this list).
- Added GitlabPromptCache and enabled subfolder access.
- Made config-defined models non-editable in the UI.
- Added an Apply Guardrail testing playground to the UI.
- Added FAL AI image generation support.
- Added the Mistral API model `codestral-embed-2505` (sketch after this list).
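The OpenTelemetry change above means spans emitted by LiteLLM can join a trace opened by the caller's own tracer instead of starting a fresh one. A minimal sketch, assuming the OTel callback and an exporter are already configured (both omitted here):

```python
from opentelemetry import trace

import litellm

# Assumes litellm already has the OpenTelemetry callback enabled
# (e.g. litellm.callbacks = ["otel"]) and an exporter configured.
tracer = trace.get_tracer("my-app")

# The completion's spans should now attach to this externally created
# "handle-request" trace rather than opening a new one.
with tracer.start_as_current_span("handle-request"):
    litellm.completion(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "hi"}],
    )
```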
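For the OpenRouter `base_url` change, a sketch of routing requests to a custom OpenRouter-compatible endpoint via environment variables; the variable name `OPENROUTER_API_BASE` and the key/URL values are assumptions for illustration:

```python
import os

import litellm

# Placeholder key and base URL; OPENROUTER_API_BASE is assumed to be the
# environment variable the new configuration path reads.
os.environ["OPENROUTER_API_KEY"] = "sk-or-..."
os.environ["OPENROUTER_API_BASE"] = "https://openrouter.example.internal/api/v1"

response = litellm.completion(
    model="openrouter/openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```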
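For the `LiteLLM-Disable-Message-Redaction` header, a sketch of a per-request opt-out on the proxy's Responses API using the OpenAI SDK; the proxy URL, key, and model are placeholders, and it assumes message redaction is enabled globally on the proxy:

```python
from openai import OpenAI

# Placeholder proxy URL and virtual key.
client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

# With proxy-wide message redaction enabled, this header opts a single
# Responses API call back into full message logging.
response = client.responses.create(
    model="gpt-4o-mini",
    input="Summarize the attached incident report.",
    extra_headers={"LiteLLM-Disable-Message-Redaction": "true"},
)
print(response.output_text)
```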
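For the Azure OpenAI SDK compatibility item, a sketch of calling the proxy's Responses API through the `AzureOpenAI` client; the endpoint, key, `api_version`, and deployment name are all placeholders, and the exact paths served are whatever the new `/openai` routes expose:

```python
from openai import AzureOpenAI

# The Azure SDK appends its own /openai/... paths to azure_endpoint, which the
# proxy's new routes are intended to serve. All values below are placeholders.
client = AzureOpenAI(
    azure_endpoint="http://localhost:4000",
    api_key="sk-1234",
    api_version="preview",
)

response = client.responses.create(
    model="my-azure-deployment",
    input="Hello from the Azure OpenAI SDK",
)
print(response.output_text)
```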
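And for the new Mistral embedding model, a sketch that assumes `MISTRAL_API_KEY` is set in the environment:

```python
import litellm

# codestral-embed-2505 is now recognized by the Mistral provider;
# MISTRAL_API_KEY is read from the environment.
response = litellm.embedding(
    model="mistral/codestral-embed-2505",
    input=["def hello():\n    return 'world'"],
)
print(len(response.data[0]["embedding"]))
```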
🐛 Bug Fixes
- Fixed ULID generation in Lasso API upgrade.
- Fixed duplicate trace in langfuse_otel.
- Fixed IBM Guardrails to correctly use SSL Verify argument.
- Fixed ContextWindowExceededError not being mapped from Azure OpenAI errors.
- Ensured the key's metadata and guardrail information are logged to Datadog (DD).
- Ensured error information is logged to OpenTelemetry (OTEL).
- Fixed a minor proxy issue where the user API key, team ID, and user ID were missing from custom callbacks.
- Removed the limit on numerical inputs in the admin UI.
- Added a "key already exists" error notification in the UI.
- Preserved Bedrock inference profile IDs in health checks.
- Supported tool usage messages with Langfuse OTEL integration.
- Added Haiku 4.5 pricing for OpenRouter.
- Enhanced requester metadata retrieval from API key auth for Opik.
- Added graceful degradation for the Pillar service when using LiteLLM.
- Ensured Key Guardrails are applied.
- Added Base64 handling for SQS Logger.
- Fixed mutation of the original request in Gemini requests.
- Redacted reasoning summaries in ResponsesAPI output when message logging is disabled.
- Supported text.format parameter in Responses API for providers without native ResponsesAPIConfig.
- Removed unnecessary model variable assignment.
- Prevented memory leaks from jitter and frequent job intervals in apscheduler.
- Allowed using ARNs as the model identifier when generating images via Bedrock (sketch after this list).
- Added fallback logic for detecting file content type when S3 returns a generic one.
- Prevented httpx DeprecationWarning memory leak in AsyncHTTPHandler.
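Following the Bedrock ARN fix, an ARN can be passed directly as the model identifier for image generation. A sketch with a made-up ARN, assuming AWS credentials come from the standard environment/config:

```python
import litellm

# Illustrative ARN; region, account, and profile name are placeholders.
response = litellm.image_generation(
    model=(
        "bedrock/arn:aws:bedrock:us-east-1:111122223333:"
        "application-inference-profile/example-profile"
    ),
    prompt="a watercolor fox",
)
print(response.data[0])
```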
🔧 Affected Symbols
Lasso API v3 · OpenTelemetry context propagation · IBM Guardrails · ContextWindowExceededError · Azure OpenAI · LiteLLM-Disable-Message-Redaction header · Responses API · Bedrock inference profile IDs · Langfuse OTEL integration · GitlabPromptCache · CustomLLM subclasses · Gemini request · SQS Logger · apscheduler · httpx.AsyncHTTPHandler
⚡ Deprecations
- Model deprecation dates are now added.