Changelog

v1.78.7-nightly

📦 litellm
✨ 12 features · 🐛 8 fixes · 🔧 13 symbols

Summary

This release adds support for several web search APIs (Perplexity, Tavily, Parallel AI, EXA AI) and a new guardrails integration (GraySwan). Bug fixes address Ollama chunk parsing, pass-through budget enforcement, and a configuration key rename.

✨ New Features

  • Added support for GraySwan Guardrails.
  • Added SENTRY_ENVIRONMENT configuration for Sentry integration.
  • Added `mode` setting and health check support for OCR models via the `/ocr` endpoint.
  • Added `search()` API for web search using the Perplexity API.
  • Added Tavily Search API support.
  • Added Parallel AI - Search API support.
  • Added EXA AI Search API support.
  • Added imageConfig parameter support for gemini-2.5-flash-image.
  • Added /search endpoint on LiteLLM Gateway.
  • Added ability to set authentication on passthrough endpoints via the UI.
  • Added support for prompt caching for Anthropic Claude on Databricks.
  • Added support for the `embeddings_by_type` response format in Bedrock Cohere Embed v1.
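
The new `/search` endpoint on the LiteLLM Gateway can be called over plain HTTP from any client. A minimal sketch using only the standard library, assuming a gateway at `localhost:4000` and a `query`/`search_provider` payload shape (the exact request schema is not specified in these notes and is a placeholder):

```python
import json
import urllib.request


def build_search_payload(query: str, provider: str = "tavily") -> dict:
    """Assumed request body for the gateway's /search endpoint."""
    return {"query": query, "search_provider": provider}


def gateway_search(base_url: str, api_key: str, query: str,
                   provider: str = "tavily") -> dict:
    """POST a web-search request to a running LiteLLM Gateway."""
    data = json.dumps(build_search_payload(query, provider)).encode()
    req = urllib.request.Request(
        f"{base_url}/search",
        data=data,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Example (requires a running gateway):
# results = gateway_search("http://localhost:4000", "sk-...", "litellm changelog")
```

The `provider` argument would select among the newly supported backends (Perplexity, Tavily, Parallel AI, EXA AI), but the parameter name is an assumption.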

🐛 Bug Fixes

  • Fixed OAuth authorization endpoint by adding response_type + PKCE parameters.
  • Fixed Auth Header issue for MCP Tool Call.
  • Fixed Ollama chunk parsing error related to issue #13333.
  • Fixed reasoning item ID auto-generation causing encrypted content verification errors.
  • Fixed the date for Claude 3.7 Sonnet in GovCloud.
  • Renamed configured_cold_storage_logger to cold_storage_custom_logger.
  • Applied max_connections configuration to Redis async client.
  • Fixed pass-through endpoint budget enforcement bug.
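
Deployments using the old cold-storage logger key will need to update their proxy config to the renamed key. A hedged sketch; the `litellm_settings` nesting and the `my_cold_storage_logger` value are placeholders, not confirmed by these notes:

```yaml
litellm_settings:
  # Before this release (old key name):
  # configured_cold_storage_logger: my_cold_storage_logger
  # From this release on:
  cold_storage_custom_logger: my_cold_storage_logger
```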

🔧 Affected Symbols

Ollama · Sentry · OCR models · Perplexity API · Tavily Search API · Parallel AI Search API · EXA AI Search API · gemini-2.5-flash-image · LiteLLM Gateway /search endpoint · Passthrough endpoints · Anthropic Claude on Databricks · Bedrock Cohere Embed v1 · Redis async client