HuggingFace Hub

The official Python client for the Hugging Face Hub.

Latest: v0.36.2 · 61 releases · 9 breaking changes · 18 common errors

Release History

v0.36.2 · 1 fix
Feb 6, 2026

This patch release fixes a critical bug related to file corruption during download retries when the server misbehaves regarding the Range header.

v1.4.1 · 1 fix
Feb 6, 2026

This patch release addresses a critical bug related to file corruption during download retries when the server misbehaves regarding the Range header.

v1.4.0 · 10 features
Feb 3, 2026

This release introduces significant CLI enhancements, including a new `hf skills add` command for AI assistants and new commands for managing papers and collections. It also adds a decentralized evaluation results module with corresponding Python helpers and improves job execution support for multi-GPU training.

v1.4.0.rc0 · 4 fixes · 10 features
Feb 3, 2026

This release introduces significant enhancements to the CLI, including a new skill installation command (`hf skills add`), improved help output, and new commands for managing papers and collections. It also adds a decentralized evaluation results module to the Hub with corresponding Python helpers.

v1.3.7 · 1 fix
Feb 2, 2026

This patch release includes a minor bug fix related to logging HTTP error headers.

v1.3.5 · 1 feature
Jan 29, 2026

This release updates the default httpx timeout to use the HF_HUB_DOWNLOAD_TIMEOUT environment variable, providing flexibility for CI environments experiencing request timeouts.

v1.3.4 · 1 fix
Jan 26, 2026

This release fixes a minor regression in CommitInfo by setting the default value of _endpoint to None.

v1.3.3 · 4 fixes · 4 features
Jan 22, 2026

This release introduces the ability to list available hardware for Hugging Face Jobs through both CLI and API, alongside several bug fixes related to streaming performance and file system resolution. The maximum file size limit has also been increased.

v1.3.2 · 1 fix · 1 feature
Jan 14, 2026

This release introduces text-to-image support for the zai-org provider and fixes a bug related to endpoint forwarding in CommitUrl.

v1.3.1
Jan 9, 2026
v1.3.0
Jan 8, 2026
v1.3.0.rc0 · Breaking · 5 fixes · 8 features
Jan 8, 2026

This release significantly reorganizes the CLI with dedicated discovery commands (`hf models`, `hf datasets`, `hf spaces`) and introduces real-time job monitoring via `hf jobs stats`. A breaking change removes the unused `direction` parameter from repository listing methods.

v1.2.4 · 3 fixes · 1 feature
Jan 6, 2026

Version 1.2.4 fixes several bugs, including an issue with create_repo and job-related API endpoints, and adds the @dataclass_transform decorator to dataclass_with_extra.

v1.2.3 · 1 fix
Dec 12, 2025

This patch release fixes a bug in the CLI where new repository creation incorrectly defaulted to private=False instead of private=None, aligning CLI behavior with the API.

v1.2.2 · 2 fixes
Dec 10, 2025

This patch release addresses two specific bugs: one related to reading corrupted metadata files and another concerning the display of the HF_TOKEN message in the authentication list.

v1.2.0 · Breaking · 6 fixes · 8 features
Dec 4, 2025

This release significantly enhances rate limit handling with automatic retries and clearer errors, introduces the daily papers endpoint, and adds OVHcloud as a new Inference Provider. It also includes a breaking change where several access request listing functions now return iterators instead of lists.

v1.1.7 · 1 feature
Dec 1, 2025

This release introduces a convenient top-level import for HfFileSystem, now accessible as `huggingface_hub.hffs`.

v1.1.6 · 4 fixes
Nov 28, 2025

This release focuses on stability, delivering several bug fixes across repository downloading, error handling, and CLI functionality.

v1.1.5 · 3 fixes · 2 features
Nov 20, 2025

This release introduces OVHcloud AI Endpoints as a new Inference Provider and significantly speeds up CLI installation by integrating `uv` support. Several minor bug fixes address issues in collections, CLI debugging, and inference parsing.

v1.1.4 · Breaking · 1 feature
Nov 13, 2025

Version 1.1.4 introduces pagination for `list_user_access` results, which required a necessary breaking change to align with the updated server API.

v1.1.3 · 1 fix · 1 feature
Nov 13, 2025

This patch release introduces an optional 'name' parameter for catalog deployment and resolves rate-limiting issues encountered during large dataset downloads.

v1.1.0 · 2 fixes · 8 features
Nov 5, 2025

This release focuses on optimizing the file download experience through multi-threading and cleaner CLI output, while significantly expanding CLI capabilities with new commands for managing Inference Endpoints and verifying cache integrity. Support for WaveSpeedAI and for image segmentation on fal is also introduced.

v1.0.1 · Breaking · 1 fix
Oct 28, 2025

This patch release removes the lingering dependency on `aiohttp` from the `huggingface_hub[inference]` extra and cleans up an unused internal method.

v1.0.0 · Breaking · 2 fixes · 8 features
Oct 24, 2025

Hugging Face Hub library v1.0 introduces a major migration to use httpx for HTTP requests, dropping requests/aiohttp dependencies, and completely revamps the CLI experience, replacing `huggingface-cli` with the new `hf` command.

v0.36.0 · 5 fixes · 3 features
Oct 23, 2025

This release focuses on significant performance optimizations for HfFileSystem and introduces the new get_organization_overview API endpoint, serving as the final minor release before v1.0.0.

v0.35.3 · 2 fixes
Sep 29, 2025

This release addresses two specific bug fixes related to fal-ai image-to-image inference and Tiny-Agents tool usage.

v0.35.2 · 1 fix · 2 features
Sep 29, 2025

This release introduces Z.ai as a new official Inference Provider on the Hub and includes an optimization for file system glob operations.

v0.35.1 · 2 fixes
Sep 23, 2025

This patch release adjusts the retry logic to only target 5xx errors and fixes an issue with unresolved forward references in strict dataclasses.

v0.35.0 · 12 fixes · 8 features
Sep 16, 2025

This release introduces Scheduled Jobs for running compute tasks on a regular basis via a new CLI interface, alongside significant updates to the Inference Client supporting Image-to-Video and new providers. Several bug fixes and maintenance updates were also applied across the CLI and Jobs API.

v0.34.6 · 2 features
Sep 16, 2025

This release introduces PublicAI as a new supported inference provider, allowing users to interact with models hosted by this nonprofit organization via InferenceClient.

v0.34.5 · 1 feature
Sep 15, 2025

This release introduces Scaleway as a new supported inference provider, allowing users to access models hosted on their infrastructure via the InferenceClient.

v0.34.4 · 1 fix · 7 features
Aug 8, 2025

This release introduces support for the Image-To-Video inference task using Fal AI and includes several quality-of-life improvements across job handling, environment dumping, and repository uploads.

v0.34.3 · 1 feature
Jul 29, 2025

This release updates the uv image for Jobs and improves the whoami command output by adding a 'user:' prefix.

v0.34.2 · 2 fixes
Jul 28, 2025

This patch release addresses two specific bugs: one related to path extension on Windows and another concerning incorrect total size reporting during download resumption.

v0.34.1 · 2 fixes
Jul 25, 2025

Version 0.34.1 primarily addresses stability issues in file downloading utilities, fixing potential failures with private tokens and race conditions during concurrent downloads.

v0.34.0 · 12 fixes · 13 features
Jul 24, 2025

This release introduces 'Jobs', a powerful new CLI and Python API for running compute tasks on Hugging Face infrastructure using Docker-like commands. Additionally, the Hugging Face CLI has been officially renamed from `huggingface-cli` to `hf`, and the InferenceClient gained image-to-image support.

v0.33.5 · 1 fix
Jul 24, 2025

This patch release addresses a specific UserWarning related to open sessions when using AsyncInferenceClient for streaming operations.

v0.33.4 · 1 feature
Jul 11, 2025

This release updates tiny-agent to omit parameters in its default tools.

v0.33.3
Jul 11, 2025

This release primarily updates the tiny-agents example.

v0.33.2 · Breaking · 1 fix · 1 feature
Jul 2, 2025

This release updates the Tiny-Agent integration to use the VSCode MCP format, requiring significant changes to configuration structure, including flattening nested mappings and moving request headers to the root level.

v0.33.1 · 3 fixes · 1 feature
Jun 25, 2025

This patch release (v0.33.1) focuses on bug fixes related to payload preparation and inference endpoint health checks, alongside adding tool call support to Tiny agents messages.

v0.33.0 · 11 fixes · 4 features
Jun 11, 2025

This release introduces two major new inference providers, Featherless.AI and Groq, significantly enhancing model accessibility and inference speed. It also brings several quality-of-life improvements, bug fixes across various components, and advancements for local agent execution via MCP and tiny-agents.

v0.32.6 · 1 fix
Jun 11, 2025

This patch release fixes a bug related to the incorrect saving of upload_mode and remote_oid parameters.

v0.32.5 · 2 features
Jun 10, 2025

This minor release introduces the ability to inject environment variables into headers and enhances the codebase with better type annotations.

v0.32.4 · 3 fixes
Jun 3, 2025

This release focuses on bug fixes, specifically addressing issues in `asyncio` usage, token yielding, and the `InferenceClient.question_answering` method.

v0.32.3 · 2 fixes · 1 feature
May 30, 2025

This release focuses on improvements and bug fixes for the tiny-agents feature, including better environment variable handling and CLI stability.

v0.32.2 · 1 fix · 1 feature
May 27, 2025

Version 0.32.2 introduces local/remote endpoint inference support and resolves a critical bug affecting snapshot downloads for extremely large repositories.

v0.32.1 · 1 fix
May 26, 2025

This is a patch release focused on fixing a specific reported issue (#3116).

v0.32.0 · 1 fix · 7 features
May 22, 2025

This release introduces powerful new capabilities for LLM interaction via the Model Context Protocol (MCP) Client and Tiny Agents CLI, alongside support for new inference providers and enhanced dataclass validation with the @strict decorator.

v0.31.4 · 1 fix · 2 features
May 19, 2025

This release introduces new `strict` decorators for dataclass validation and adds DTensor support to storage size helpers, alongside several bug fixes.

v0.31.2 · 1 feature
May 13, 2025

This patch release makes the `hf-xet` dependency optional, improving installation flexibility.

v0.31.0 · Breaking · 8 fixes · 9 features
May 6, 2025

This release introduces major enhancements to Inference Providers, adding support for LoRA inference via fal.ai and Replicate, and enabling 'auto' provider selection as the new default. Additionally, Xet uploads now support byte arrays, and large file downloads (>50GB) are more reliable.

v0.30.2 · 2 fixes
Apr 8, 2025

This patch release addresses several bugs related to the InferenceClient, specifically fixing text-generation with external providers and conversational handling in HfInference.

v0.30.1 · 1 fix
Mar 31, 2025

This patch release addresses a specific bug reported in the Hugging Face Hub repository.

v0.30.0 · Breaking · 4 fixes · 12 features
Mar 28, 2025

This major release introduces Xet, a new protocol for storing large objects in Git, and significantly enhances the InferenceClient with new providers (Cerebras, Cohere, Novita) and organizational billing support. New features also include wildcard support for CLI uploads and programmatic LFS file management.

v0.29.3 · 2 features
Mar 11, 2025

This release introduces client-side support for the Cerebras and Cohere providers in preparation for their official launch on the Hub.

v0.29.2 · 2 fixes
Mar 5, 2025

This patch release (v0.29.2) addresses two specific bugs concerning payload model naming when using URLs and ensuring sys.stdout is restored after errors in notebook_login.

v0.29.1 · 2 fixes
Feb 20, 2025

This patch release (v0.29.1) addresses two specific bugs related to large folder uploads and the inference endpoint waiting mechanism.

v0.29.0 · 9 fixes · 6 features
Feb 18, 2025

This release significantly expands serverless inference capabilities by adding official support for Fireworks AI, Hyperbolic, Nebius AI Studio, and Novita providers. It also includes several quality-of-life improvements and deprecates some legacy hf-inference specific features.

v0.28.1 · 1 fix
Jan 30, 2025

This patch release fixes a critical bug introduced in v0.28.0 related to setting the HF_ENDPOINT environment variable with subpaths.

v0.28.0 · Breaking · 10 fixes · 5 features
Jan 28, 2025

This release introduces major updates to the InferenceClient, enabling unified inference across multiple third-party providers using Hugging Face Hub model IDs. Additionally, HfApi has been updated with new repository properties and the deprecated `like` endpoint has been removed.

Common Errors

LocalEntryNotFoundError · 4 reports

The "LocalEntryNotFoundError" in huggingface-hub often means a requested file isn't cached locally. Ensure you have internet connectivity to download the file. If the file should exist, check that you're using the correct `repo_id` and `filename`, and consider clearing your cache with `huggingface-cli delete-cache` if necessary.
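The cache-miss case above can be sketched in code. A minimal example, assuming a recent huggingface_hub version; the `repo_id` and `filename` passed in are placeholders, and `load_cached_or_none` is a hypothetical helper, not a library API:

```python
from typing import Optional

from huggingface_hub import hf_hub_download
from huggingface_hub.errors import LocalEntryNotFoundError


def load_cached_or_none(repo_id: str, filename: str) -> Optional[str]:
    """Return the cached path for repo_id/filename, or None if not cached.

    With local_files_only=True, hf_hub_download never touches the network,
    so a missing file surfaces as LocalEntryNotFoundError instead of
    triggering a download.
    """
    try:
        return hf_hub_download(repo_id, filename, local_files_only=True)
    except LocalEntryNotFoundError:
        return None
```

Once connectivity is confirmed, a plain `hf_hub_download(repo_id, filename)` call (without `local_files_only`) fetches the file and populates the cache.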

FileNotFoundError · 4 reports

FileNotFoundError in huggingface-hub often arises from temporary network issues during file downloads or corrupted local cache files. Clearing your local cache (e.g. with `huggingface-cli delete-cache`) and retrying the download is the primary solution. Implement retry mechanisms in your code to automatically handle intermittent network errors, especially when downloading large files.
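A retry mechanism of the kind suggested above can be sketched with the standard library alone. `download_with_retries` is a hypothetical helper, not part of huggingface_hub, and `download_fn` stands in for a callable such as a wrapped `hf_hub_download`:

```python
import time


def download_with_retries(download_fn, max_attempts=3, delay_s=2.0):
    """Call download_fn, retrying transient OS-level failures.

    Hypothetical helper: retries up to max_attempts times with a fixed
    pause between attempts, then re-raises the last error.
    """
    last_exc = None
    for attempt in range(1, max_attempts + 1):
        try:
            return download_fn()
        except OSError as exc:  # FileNotFoundError is a subclass of OSError
            last_exc = exc
            if attempt < max_attempts:
                time.sleep(delay_s)
    raise last_exc
```

A caller might use it as `download_with_retries(lambda: hf_hub_download("some-org/some-model", "config.json"))`, where the repo id is a placeholder.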

BadRequestError · 3 reports

The "BadRequestError" in huggingface_hub often arises when the API receives an improperly formatted request, such as incorrect data types or missing required fields in the input. Inspect your request payload (e.g., 'inputs' parameter) to ensure it adheres to the API's expected schema, typically a dictionary with specific key-value pairs. Double-check your code for instances where None or unexpected objects are being passed as input and correct them according to the API documentation before resubmitting the request.
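A client-side pre-flight check along those lines can be sketched as follows. `validate_payload` is a hypothetical helper, and the expected schema (a dict with a non-empty string under `inputs`) is an assumption for illustration; always defer to the API documentation for the task you are calling:

```python
def validate_payload(payload):
    """Raise early, client-side, instead of letting the server answer 400.

    Hypothetical check: assumes the task expects {"inputs": "<non-empty str>"}.
    """
    if not isinstance(payload, dict):
        raise ValueError(f"payload must be a dict, got {type(payload).__name__}")
    inputs = payload.get("inputs")
    if not isinstance(inputs, str) or not inputs:
        raise ValueError(f"'inputs' must be a non-empty string, got {inputs!r}")
    return payload
```

Catching a malformed payload before the request is sent makes the failure actionable (a Python traceback pointing at your code) rather than an opaque 400 from the server.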

ReadTimeoutError · 3 reports

ReadTimeoutError in huggingface-hub usually arises from network instability or server-side delays when downloading or accessing large files/repositories. To fix this, configure the `HF_HUB_DOWNLOAD_TIMEOUT` environment variable with a higher value (in seconds) to allow more time for the download to complete. Alternatively, if using code directly, increase the `timeout` parameter within relevant functions like `hf_hub_download` to an appropriate value for your network conditions.
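Setting the variable from Python is straightforward; the one assumption worth flagging is that it should be set before huggingface_hub is imported, so the value is seen when the HTTP client is configured:

```python
import os

# Raise the download timeout to 60 seconds (the value is read as seconds).
# Set this before importing huggingface_hub so the client picks it up.
os.environ["HF_HUB_DOWNLOAD_TIMEOUT"] = "60"

# from huggingface_hub import hf_hub_download
# hf_hub_download("some-org/some-model", "config.json")  # placeholder repo id
```

Exporting `HF_HUB_DOWNLOAD_TIMEOUT=60` in the shell before launching Python achieves the same thing without touching code.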

HTTPError · 3 reports

HTTPError in huggingface-hub usually indicates a problem on the Hugging Face Hub's server-side (e.g., server overload, maintenance). Your best course of action is to retry the operation after a short delay (e.g., using exponential backoff) as the server might recover quickly. If the issue persists for an extended period, check the Hugging Face Hub's status page or community forums for updates or report the issue.
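The exponential-backoff retry suggested above can be sketched with the standard library; `backoff_delays` is a hypothetical helper, not a huggingface_hub API:

```python
import random


def backoff_delays(max_retries=5, base_s=1.0, cap_s=30.0):
    """Yield one delay per retry: base_s, 2*base_s, 4*base_s, ..., capped.

    A random jitter factor in [0.5, 1.0] is applied so that many clients
    retrying at once do not hit the server in lockstep.
    """
    for attempt in range(max_retries):
        delay = min(cap_s, base_s * (2 ** attempt))
        yield delay * random.uniform(0.5, 1.0)
```

A caller would loop over these delays, retrying the failing operation and sleeping between attempts, then give up (or report the outage) once the generator is exhausted.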

RepositoryNotFoundError · 3 reports

The "RepositoryNotFoundError" in huggingface_hub usually indicates that the repository you're trying to access (model, dataset, or Space) does not exist at the specified location, or that you lack the necessary permissions. Double-check the repository name and its owner/organization, and verify that the repository actually exists on the Hub. If it is private, make sure the machine making the request (such as a CI/CD runner) is authenticated, either by logging in with `huggingface-cli login` or by setting the `HF_TOKEN` environment variable.
