Instructor
Structured outputs for LLMs
Release History
v1.14.1 (1 fix, 1 feature): This patch release introduces support for Google GenAI context caching via the cached_content parameter.
v1.14.0 (6 fixes, 4 features): This release standardizes provider factory methods and exception handling, adds Bedrock document support, and fixes critical bugs in the GenAI, OpenAI, and Cohere integrations.
v1.13.0 (5 fixes, 2 features): This release introduces image support for Bedrock, improves type safety with a py.typed marker, and includes critical fixes for Gemini streaming and Anthropic tool blocks.
v1.12.0 (9 fixes, 6 features): This release introduces enhanced retry tracking, per-call hooks, and xAI streaming support, while fixing critical bugs in OpenAI JSON mode and Gemini response handling. It also marks the transition from validation_context to a unified context parameter.
v1.11.3 (1 fix, 3 features): This release introduces enhanced retry tracking, per-call hook support, and llms.txt documentation support, while fixing multimodal import issues.
v1.11.2 (2 fixes, 2 features): This release enhances Google Cloud Storage support for multimodal data types and restores backwards compatibility for exception imports.
v1.11.0 (Breaking; 3 fixes, 5 features): This release introduces a major modular reorganization of the codebase, adds support for the xAI, OpenRouter, and Truefoundry providers, and implements in-memory batching.
v1.10.0 (Breaking; 5 fixes, 7 features): This release introduces native caching (Redis/AutoCache), expands provider support to include DeepSeek and Anthropic parallel tools, and migrates the Google integrations to the new google-genai SDK.
v1.9.2 (4 fixes, 1 feature): This release introduces support for the xAI provider and includes several bug fixes for Gemini API safety settings and GenAI image harm categories.
v1.9.1 (4 fixes, 2 features): This release introduces Azure OpenAI support and simplifies Gemini safety configuration, while fixing public API visibility for exceptions and JSON schema issues.
v1.9.0 (Breaking; 6 fixes, 7 features): This release introduces Ollama and Writer provider support, improves the Gemini and Anthropic integrations, and standardizes VertexAI async parameters. It also enhances error handling with a new exception hierarchy and resolves several dependency conflicts.
v1.8.3 (4 fixes, 4 features): This release introduces support for asynchronous Bedrock clients and response handling, alongside bug fixes for the Bedrock converse endpoint and documentation improvements.
v1.8.2 (1 fix): This patch release removes a stray print statement to clean up console output.
v1.8.1 (2 fixes, 2 features): This release introduces a unified provider interface and enables streaming support directly within the create method, alongside fixes for Anthropic web search.
v1.8.0 (6 fixes, 1 feature): This release introduces a unified provider interface with string-based initialization and includes several bug fixes for Google GenAI and Python 3.10 type compatibility.
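String-based initialization means a provider and model are named in a single spec string such as "openai/gpt-4o". The sketch below shows how such a spec splits into its two parts; the helper name is hypothetical and this is not Instructor's actual parsing code (the library's real entry point for this is instructor.from_provider):

```python
# Hypothetical sketch of splitting a "<provider>/<model>" spec string,
# as used by string-based provider initialization (e.g. "openai/gpt-4o").
# Illustrative only, not Instructor's implementation.
def parse_provider_spec(spec: str) -> tuple[str, str]:
    provider, sep, model = spec.partition("/")
    if not sep or not provider or not model:
        raise ValueError(f"expected '<provider>/<model>', got {spec!r}")
    return provider, model

print(parse_provider_spec("openai/gpt-4o"))  # ('openai', 'gpt-4o')
```

Splitting at the first "/" keeps the provider name unambiguous even when the model identifier itself contains slashes.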
1.7.91 fix3 featuresThis release introduces async partial streaming for Gemini, adds Mistral PDF support, and improves type hinting for LiteLLM integrations.
1.7.81 fix4 featuresThis release introduces streaming support for Mistral and VertexAI, fixes a filename length bug in Google GenAI, and significantly expands documentation including Cursor rules and llms.txt support.
1.7.71 fix1 featureVersion 1.7.7 introduces SambaNova examples for both sync and async workflows and includes minor dependency fixes.
1.7.61 fixThis patch release addresses an incorrect import issue discovered in version 1.7.5.
1.7.54 featuresThis release introduces support for Mistral Structured Outputs and the Google GenAI SDK, alongside documentation improvements for SQL models and contributing guidelines.
1.7.42 fixes1 featureThis release introduces support for Open Router, updates the Anthropic dependency, and includes several documentation and testing fixes.
1.7.33 fixes7 featuresThis release introduces support for AWS Bedrock and Perplexity Sonar, adds Claude 3.7 Sonnet reasoning support, and defaults Gemini to JSON mode. It also includes various documentation improvements and a new utility to strip control characters from LLM outputs.
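The control-character utility in 1.7.3 addresses a practical problem: models occasionally emit raw control bytes inside JSON strings, which strict JSON parsers reject. A minimal sketch of the technique, assuming a hypothetical function name (not necessarily the implementation that shipped):

```python
import json

# Hedged sketch: drop ASCII control characters (keeping tab and newline)
# so strict JSON parsers accept the cleaned output.
# The function name is hypothetical, not Instructor's actual API.
def strip_control_characters(text: str) -> str:
    keep = {"\t", "\n"}
    return "".join(
        ch for ch in text
        if ch in keep or (ord(ch) >= 32 and ord(ch) != 127)
    )

raw = '{"name": "Ada\x00"}'   # NUL byte embedded in model output
cleaned = strip_control_characters(raw)
print(json.loads(cleaned))     # {'name': 'Ada'}
```

Python's json.loads rejects unescaped control characters below 0x20 inside strings by default, so stripping them (while preserving the tab and newline that legitimate text may contain) is enough to make otherwise-valid output parseable.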