LangChain
🦜🔗 The platform for reliable agents.
Release History
langchain-huggingface==1.2.2 (3 fixes, 1 feature): This release focuses on hardening HuggingFace integrations by improving repository ID validation and optimizing API calls, alongside updating model profile data and dependencies.
langchain-core==1.3.0a3 (37 fixes, 12 features): This release introduces new features such as enhanced metadata tracing, support for ChatBaseten, and multimodal token counting. It also includes numerous bug fixes related to SSRF hardening, model serialization, and tool handling, alongside the deprecation of prompt saving/loading functions due to added path validation.
langchain-classic==1.0.4 (3 fixes): This release introduces deprecations, dependency bumps including pytest and cryptography, and fixes for mypy errors and Azure AI Foundry model providers.
langchain-openai==1.1.14 (1 fix): This release fixes an image token counting issue by implementing SSRF-safe transport and updates several dependencies.
langchain-text-splitters==1.1.2 (2 fixes, 1 feature): This release updates numerous dependencies, enforces SSRF-safe transport for URL splitting, and fixes a data loss issue in RecursiveJsonSplitter.
langchain-core==1.2.31: This release ports change 36816 to the v1.2 branch.
langchain-core==1.2.30 (1 fix): This release focuses on hardening private SSRF utilities within the core library.
langchain-openai==1.1.13 (4 fixes, 1 feature): This release focuses on bug fixes related to API response handling, token extraction, and User-Agent overrides, and adds a feature that imputes placeholder filenames for OpenAI file inputs.
langchain-core==1.2.29 (1 fix): This release incorporates a fix from PR #36725.
langchain-core==1.3.0a2 (35 fixes, 14 features): This release focuses heavily on stability, security hardening (anti-SSRF), and improved token counting, especially for multimodal inputs and tool schemas. Several prompt loading/saving methods have been deprecated due to path validation improvements.
langchain-core==1.3.0a1 (35 fixes, 13 features): This release focuses heavily on stability, security hardening (anti-SSRF, pygments update), and feature parity, including Bedrock and OpenRouter support and enhanced token counting for multimodal inputs and tool schemas. Several internal cleanups and deprecations related to prompt saving paths were also introduced.
langchain-core==0.3.84 (1 fix): This release improves prompt sanitization within the core library.
langchain-core==1.2.28 (1 fix): This release improves template security by adding more sanitization.
langchain-tests==1.1.6 (1 fix): This release focuses on dependency updates, security patches (pygments), and minor fixes to standard tests for sandbox backends.
langchain-core==1.2.27 (1 fix): This release fixes symlink handling in the deprecated prompt saving path and updates a dependency comment.
langchain-ollama==1.1.0 (4 fixes, 3 features): This release introduces new features for Ollama, including support for `response_format`, `dimensions` in embeddings, and logprobs, alongside several bug fixes and dependency updates.
langchain-core==1.2.26 (2 fixes, 1 feature): This release introduces a new serializable mapping for ChatBaseten and fixes serialization/validation issues for Bedrock models.
langchain==1.2.15: This release updates the aiohttp dependency version within the langchain_v1 library.
langchain-core==1.2.25 (2 fixes): This release focuses on minor fixes, including hardened file checks in deprecated prompt loading and corrected documentation typos.
langchain-core==1.2.24 (1 fix, 1 feature): This release introduces placeholder filename imputation for OpenAI file inputs and updates the dependency constraint for Pygments.
langchain==1.2.14 (5 fixes, 1 feature): This release focuses on dependency updates, performance improvements, and several bug fixes, including updates for Azure AI Foundry providers and recursion limit handling in agents.
langchain-openrouter==0.2.1 (1 fix): This release bumps the `requests` dependency and fixes an issue in the OpenRouter integration where attribution headers were not being passed correctly.
langchain-core==1.2.23 (1 fix): This release reverts a previous fix related to tracing invocation parameters and updates the `requests` dependency version.
langchain-exa==1.1.0 (1 feature): This release changes the default search type in langchain-exa to 'auto' and includes numerous dependency bumps across the project.
langchain-openrouter==0.2.0 (1 fix, 1 feature): This release adds marketplace attribution fields to the OpenRouter integration and fixes missing model profile fields in core components.
langchain-core==1.2.22 (1 fix): This release introduces path validation for prompt saving and loading, leading to the deprecation of the underlying methods.
langchain-openai==1.1.12 (7 fixes, 1 feature): This release focuses on bug fixes for OpenAI integrations, including support for the phase parameter and a fix for file descriptor leaks, alongside updates to model profiles.
langchain-core==1.2.21 (1 fix): This release adds missing fields to model profiles, implements schema drift warnings, and includes minor cleanup.
langchain==1.2.13 (2 fixes, 1 feature): This release includes minor dependency bumps, CI improvements, and adds LangSmith integration metadata to agent and chat model initialization functions. A bug fix was also applied to the OpenAI Responses API input typing.
langchain-core==1.2.20 (1 fix, 2 features): This release includes security hardening, improved LangSmith integration metadata, and a fix for tracing invocation parameters.
langchain-anthropic==1.4.0 (2 features): This release introduces explicit caching for system messages and tool definitions via AnthropicPromptCachingMiddleware and improves cache control delegation.
langchain-anthropic==1.3.5 (4 fixes, 1 feature): This release focuses on bug fixes related to caching, streaming, and tool choice handling, alongside dependency updates and new model profile features.
langchain-mistralai==1.1.2 (3 fixes, 2 features): This release introduces new fields to model profiles and fixes several bugs related to exception handling and type resolution. Dependencies, including langsmith and urllib3, were also updated.
langchain-classic==1.0.3 (Breaking): This release is mostly internal housekeeping, moving BaseCrossEncoder to langchain-core and updating several minor and patch dependencies across the project.
langchain-core==1.2.19: This release moves the BaseCrossEncoder class to langchain-core and updates the tornado dependency version.
langchain==1.2.12 (1 feature): This release introduces tracing capabilities for wrapped models and tool calls within langchain_v1.
langchain==1.2.11 (3 fixes, 2 features): This release introduces support for OpenRouter and automatic server-side compaction in OpenAI, alongside several dependency updates and bug fixes related to model initialization and detector output.
langchain-openai==1.1.11 (Breaking; 5 fixes, 2 features): This release introduces support for tool search and streaming token usage for OpenRouter, alongside several bug fixes related to model detection and structured output handling. It also bumps minimum dependency versions.
langchain-core==1.2.18 (2 fixes, 1 feature): This release focuses on minor fixes, including corrected docstrings and preserved schema factory settings, alongside tool search support for OpenAI.
langchain==0.3.28 (Breaking; 14 fixes, 6 features): This release bumps the minimum required version of `langchain-core` to 0.3.73 and patches a critical ReDoS vulnerability in MRKL/ReAct regexes. It also includes numerous internal cleanups, dependency bumps (such as Ruff), and updates to model handling, particularly for Anthropic and Google models.
langchain-classic==1.0.2 (7 fixes, 2 features): This release patches a ReDoS vulnerability and includes various dependency updates. New features include support for automatic server-side compaction in OpenAI integrations.
langchain-openrouter==0.1.0 (3 fixes, 4 features): This release introduces several new features, including streaming token usage and cost reporting for OpenRouter, alongside various infrastructure and profile updates.
langchain-core==1.2.17 (1 fix): This release fixes extraction of usage metadata from tracer messages and includes several dependency bumps.
langchain-huggingface==1.2.1 (2 fixes, 1 feature): This release focuses on dependency updates, resolving compatibility issues with huggingface-hub 1.x, and introducing new fields to model profiles.
langchain-core==1.2.16 (1 fix): This release includes a minor bug fix for handling empty tool chunk IDs during merging.
langchain-anthropic==1.3.4 (5 fixes, 2 features): This release improves the Anthropic integration by adding Bedrock support and a User-Agent header, alongside several fixes related to model IDs and response handling.
langchain-core==1.2.15 (3 fixes, 1 feature): This release focuses on bug fixes in the core library, improved error handling, and the addition (and subsequent reversion) of the ChatAnthropicBedrock wrapper. Performance was also improved by deferring specific langsmith imports.
langchain-core==1.2.14 (4 fixes): This release focuses on bug fixes within the core library, addressing tool call merging, recursion errors, and LangSmith tracing parameter handling. Dependency groups were also updated.
langchain-text-splitters==1.1.1 (4 fixes, 1 feature): This release focuses on bug fixes, including resolving an SSRF vulnerability and mutation issues in splitters, alongside adding support for model_kwargs in SentenceTransformersTokenTextSplitter.
langchain-tests==1.1.5: This release bumps several internal dependency groups, most notably updating langsmith to version 0.6.3.
langchain-openai==1.1.10 (5 fixes, 2 features): This release introduces support for automatic server-side compaction in OpenAI and adds the OpenRouter provider package. It also includes several bug fixes related to model properties, error handling, and dependency version bumps.
langchain-openrouter==0.0.2: This release bumps the core version and silences a warning within the langchain-openrouter integration.
langchain-anthropic==1.3.3 (2 fixes, 2 features): This release adds support for effort="max" to the Anthropic integration and introduces a new ContextOverflowError for handling context limits across the Anthropic and OpenAI integrations.
langchain-openai==1.1.9 (1 fix, 2 features): This release introduces a new ContextOverflowError for handling context limits in the Anthropic and OpenAI integrations and fixes URL sanitization during image token counting.
langchain-openai==1.1.8 (1 fix): This patch release fixes detection of codex models for API response preferences and includes various dependency and chore updates.
langchain-standard-tests==1.1.4 (1 feature): This release adds standard tests for sandbox providers and includes minor maintenance chores within the standard-tests package.
langchain-anthropic==1.3.2 (2 fixes, 2 features): This release focuses on bug fixes and feature enhancements, including support for output_config and automatic compaction (Opus 4.6). It also includes dependency updates.
langchain-groq==1.1.2 (1 fix, 1 feature): This release introduces support for LangChain image types and includes a core serialization fix. Dependencies were also updated.
langchain-standard-tests==1.1.3: This release includes an initial version and upgrades urllib3 to version 2.6.3.
langchain-xai==1.2.2 (1 fix): This release fixes a bug in streaming API routing for chat completions and responses.
langchain-core==1.2.13 (1 feature): This release introduces the new `langchain-openrouter` provider package and expands the documentation for `get_lc_namespace`.
langchain-core==1.2.12 (1 fix): This release fixes setting the text attribute on ChatGeneration objects.
langchain-core==1.2.11 (2 fixes): This release focuses on bug fixes, including sanitizing URLs for OpenAI token counting and improving exception handling in the tracer.
langchain==1.2.10 (1 fix): This release focuses on dependency updates and fixes token counting for partial message sequences. An internal variable name was also updated.
langchain-core==1.2.10 (3 features): This release introduces a new `ContextOverflowError` for better error handling in LLM integrations and extends token counting to include tool schemas.
langchain==1.2.9 (4 fixes, 2 features): This release introduces support for state updates via `wrap_model_call` and threading context in agent flows, alongside several bug fixes related to schema normalization and token counting.
langchain-core==1.2.9 (2 fixes, 1 feature): This release improves approximate token counting by allowing scaling based on reported usage and adjusting the capping mechanism. It also reverts a previous change regarding hex color regex precompilation.
langchain==1.2.8 (1 fix, 1 feature): This release focuses on minor dependency upgrades and linting, adds `ToolCallRequest` to the middleware exports, and fixes agent factory name mismatches.
langchain-core==1.2.8 (13 fixes, 2 features): This release delivers numerous bug fixes across the core library, including improved token counting with multimodal support, and adds an XML format option for buffer string generation. Dependency updates and various documentation enhancements were also included.
langchain==1.2.7 (2 fixes, 1 feature): This release introduces dynamic tool registration through middleware and includes minor fixes to the summarization prompt and system message grammar.
langchain==1.2.6 (1 fix): This release fixes the signature and configuration invocation of the SummarizationMiddleware.
langchain==1.2.5 (1 fix, 1 feature): This release updates the summarization prompt and adds metadata configuration to the summarization model invocation.
langchain==1.2.4 (6 fixes, 2 features): This release focuses heavily on internal type checking improvements across many test suites and introduces metadata for agent names. A key functional change is the addition of the `state` field to `_ModelRequestOverrides`.
langchain-core==0.3.83 (1 feature): This release updates the core library to use uuid7 for generating run identifiers across several components.
langchain-core==0.3.82, langchain-core==1.2.7, langchain==1.2.3 (2 fixes, 2 features): This combined release improves summarization logic by leveraging usage metadata, fixes tool call pairings, and fixes provider mapping for Azure OpenAI.
langchain==1.2.2 (2 fixes): This patch release fixes parallel tool usage in the planning middleware and improves internal test type safety.
langchain-openai==1.1.7 (4 fixes): This release focuses on bug fixes for the OpenAI integration, specifically improving error handling for structured output refusals and refining token counting logic for function calls.
langchain==1.2.1 (Breaking; 5 fixes, 4 features): This release focuses on bug fixes for message summarization and tool schemas, adds Google GenAI support to embeddings, and introduces several linting and type-checking improvements.
langchain-anthropic==1.3.1 (5 fixes): This maintenance release addresses a security vulnerability (CVE-2025-68664) and fixes several bugs related to cache control and serialization.
langchain-xai==1.2.1 (1 fix): This release fixes a bug in token usage reporting by ensuring reasoning tokens are included in the total output count.
langchain-core==1.2.6 (Breaking; 4 fixes, 3 features): This release focuses on internal performance optimizations, improved type safety, and bug fixes for ChatPromptTemplate and CallbackManager. It also updates internal tooling and documentation to reflect beta statuses and Pydantic v2 compatibility.
langchain-xai==1.2.0 (4 fixes, 2 features): This release updates langchain-xai to 1.2.0 and langchain-openai to 1.1.6, focusing on streaming improvements, token counting fixes for GPT-5, and core serialization patches.
langchain-tests==1.1.2 (1 fix): This release includes a critical serialization fix for langchain-core and internal code style improvements across standard-tests and text-splitters.
langchain-classic==1.0.1 (Breaking; 5 fixes, 6 features): This release features enhanced model initialization for Anthropic and Google GenAI, a transition to uuid7 for run IDs, and the removal of the Tigris provider.
langchain-core==0.3.81 (1 fix): This release fixes a serialization issue in langchain-core.
langchain-core==1.2.5 (Breaking; 6 fixes, 3 features): This release focuses on bug fixes for tool decorators and serialization, while introducing PEP 702 support for deprecations and automated tool call tracking.
langchain-core==1.2.4 (1 fix, 1 feature): This release introduces usage metadata tracking in LangChainTracer and fixes a trace persistence issue for iterator-based inputs.
langchain-core==1.2.3 (1 fix, 1 feature): This release fixes handling of unknown blocks in OpenAI message conversion and adds infrastructure checks for lockfile consistency.
langchain-openai==1.1.6 (1 feature): This release updates the maximum input token limits for the gpt-5 series models.
langchain-openai==1.1.5 (1 fix): This release fixes an issue with chunk_position by delegating its setting to langchain-core.
langchain-tests==1.1.1 (1 feature): This release introduces more descriptive error messages for streaming cases to improve developer experience during testing.
langchain-core==1.2.2 (1 fix): This maintenance release focuses on Python 3.14 compatibility and internal lockfile updates.
langchain-openai==1.1.4 (2 fixes): This release reverts a model selection change for flex usage and fixes documentation links for structured output.
langchain==1.2.0 (4 fixes, 2 features): This release introduces a strict mode for structured outputs in ProviderStrategy and adds an extras field to BaseTool, alongside fixes for multithreading race conditions and HuggingFace model initialization.
langchain-core==1.2.1 (1 fix): This patch release fixes a bug in tool call parsing when handling null arguments and improves type hinting for ToolCallChunk.
langchain-text-splitters==1.1.0 (2 fixes, 3 features): This release introduces R language support for text splitting and transitions to uuid7 for run IDs, while ensuring compatibility with Python 3.14.
langchain-openai==1.1.3 (2 fixes, 1 feature): This release introduces support for the gpt-5.2-pro responses API and includes bug fixes for image resizing and API response handling.
langchain-huggingface==1.2.0 (1 fix, 2 features): This release focuses on fixing and improving the init_chat_model integration for HuggingFace backends and updating core dependencies.
Common Errors
ModuleNotFoundError (4 reports): This error usually means the LangChain library is not installed, or the installed version is outdated and missing the module. Fix it by installing or upgrading with pip: `pip install --upgrade langchain`. If you use Poetry: `poetry add langchain`.
BadRequestResponseError (2 reports): This error often arises from exceeding the allowed token limit or from malformed requests sent to the LLM provider during multi-turn conversations, especially when streaming. To fix it, either reduce the prompt size by summarizing or truncating earlier conversation turns, or ensure the request payload is properly formatted and encoded, especially when handling streamed responses containing structured data.
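The truncation approach can be sketched in plain Python: keep the system message plus only the most recent turns that fit a token budget. The dict message shape, the word-count token estimate, and `trim_history` itself are illustrative assumptions, not a LangChain API (LangChain does ship a related `trim_messages` utility in `langchain_core.messages`).

```python
# Hypothetical sketch: drop the oldest turns until the payload fits a budget.
def estimate_tokens(message: dict) -> int:
    # Crude stand-in for a real tokenizer: one token per whitespace word.
    return len(message["content"].split())

def trim_history(messages: list[dict], budget: int) -> list[dict]:
    system, turns = messages[0], messages[1:]
    kept: list[dict] = []
    used = estimate_tokens(system)  # the system message always survives
    # Walk backwards so the newest turns are kept first.
    for msg in reversed(turns):
        cost = estimate_tokens(msg)
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    return [system] + list(reversed(kept))
```

In production you would swap `estimate_tokens` for the model's real tokenizer, since word counts underestimate most tokenizers' output.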
OutputParserException (2 reports): This error often arises from a mismatch between the LLM's generated output and the format your OutputParser expects. To fix it, inspect the LLM output for inconsistencies (incorrect field names, wrong data types, missing fields), then either adjust your prompt to guide the LLM toward the correct format or make your OutputParser more tolerant of variations. Consider adding error handling, such as a try/except block around the parser, to manage parsing failures gracefully.
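A minimal sketch of the tolerant-parsing idea, with the stdlib `json.loads` standing in for a LangChain output parser; the fence-stripping and the `safe_parse` helper are illustrative assumptions, not library API:

```python
import json

def safe_parse(text: str, fallback=None):
    """Parse LLM output as JSON, tolerating markdown fences; return fallback on failure."""
    cleaned = text.strip()
    # Models often wrap JSON in a ```json ... ``` fence; strip it before parsing.
    if cleaned.startswith("```"):
        cleaned = cleaned.strip("`")
        if cleaned.startswith("json"):
            cleaned = cleaned[4:]
    try:
        return json.loads(cleaned)
    except json.JSONDecodeError:
        # The try/except mirrors wrapping a real parser to catch OutputParserException.
        return fallback
```

With a real LangChain parser the shape is the same: call `parser.parse(text)` in the `try` block and catch `OutputParserException` instead of `json.JSONDecodeError`.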
BadRequestError (2 reports): This error often arises from exceeding the token limit of the chosen language model or from incorrectly formatted API requests. Resolve it by shortening the input text, adjusting max_tokens in your ChatOpenAI/ChatAnthropicVertex constructor, or verifying that your prompts and API calls match the model's expected input format. If using an agent, refine the prompt to encourage concise reasoning.
NotImplementedError (2 reports): This error usually arises when a base class or interface method is called without a concrete implementation in a subclass or a mock. To fix it, ensure the specific method raising the error (e.g., `profile` access or structured output handling) is implemented in your custom class, and avoid calling unsupported features on mock/fake objects. If subclassing a base class, implement the missing method; if using a mock, avoid calling unimplemented functions.
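The pattern behind this error, and its fix, in miniature; `BaseReranker` and `OverlapReranker` are made-up classes for illustration, not LangChain types:

```python
class BaseReranker:
    """Hypothetical base class: the base method is deliberately a stub."""

    def score(self, query: str, doc: str) -> float:
        # Calling this on the base class (or an incomplete subclass)
        # is exactly how NotImplementedError surfaces at runtime.
        raise NotImplementedError("subclasses must implement score()")


class OverlapReranker(BaseReranker):
    """Concrete subclass: overriding score() is what resolves the error."""

    def score(self, query: str, doc: str) -> float:
        # Toy scoring: count words shared between query and document.
        return float(len(set(query.split()) & set(doc.split())))
```

The same applies to mocks: a fake chat model that stubs only `invoke` will raise NotImplementedError if the code under test also calls, say, structured-output methods.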
TypeError (1 report): This error often arises from incorrect type annotations, especially when converting TypedDicts (possibly containing `NotRequired` fields) into formats expected by external APIs such as OpenAI function schemas. To fix it, check your TypedDict definitions for compatibility with the target schema, mapping `NotRequired` fields to optional types or providing defaults to avoid unexpected `None` values during conversion. Verify that custom classes used in schemas have appropriate `__init__` methods or Pydantic configurations.
Related AI & LLMs Packages
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
Ollama: Get up and running with OpenAI gpt-oss, DeepSeek-R1, Gemma 3 and other models.
ComfyUI: The most powerful and modular diffusion model GUI, api and backend with a graph/nodes interface.
llama.cpp: LLM inference in C/C++
GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
vLLM: A high-throughput and memory-efficient inference and serving engine for LLMs