HuggingFace Hub
The official Python client for the Hugging Face Hub.
Release History
v0.36.2 (1 fix): This patch release fixes a critical bug related to file corruption during download retries when the server misbehaves regarding the Range header.
v1.4.1 (1 fix): This patch release addresses a critical bug related to file corruption during download retries when the server misbehaves regarding the Range header.
v1.4.0 (10 features): This release introduces significant CLI enhancements, including a new `hf skills add` command for AI assistants and new commands for managing papers and collections. It also adds a decentralized evaluation results module with corresponding Python helpers and improves job execution support for multi-GPU training.
v1.4.0.rc0 (4 fixes, 10 features): This release candidate introduces significant enhancements to the CLI, including a new skill installation command (`hf skills add`), improved help output, and new commands for managing papers and collections. It also adds a decentralized evaluation results module to the Hub with corresponding Python helpers.
v1.3.7 (1 fix): This patch release includes a minor bug fix related to logging HTTP error headers.
v1.3.5 (1 feature): This release updates the default httpx timeout to use the HF_HUB_DOWNLOAD_TIMEOUT environment variable, providing flexibility for CI environments experiencing request timeouts.
v1.3.4 (1 fix): This release fixes a minor regression in CommitInfo by setting the default value of _endpoint to None.
v1.3.3 (4 fixes, 4 features): This release introduces the ability to list available hardware for Hugging Face Jobs through both the CLI and the API, alongside several bug fixes related to streaming performance and file system resolution. The maximum file size limit has also been increased.
v1.3.2 (1 fix, 1 feature): This release introduces text-to-image support for the zai-org provider and fixes a bug related to endpoint forwarding in CommitUrl.
v1.3.1 / v1.3.0 / v1.3.0.rc0 (Breaking, 5 fixes, 8 features): This release significantly reorganizes the CLI with dedicated discovery commands (`hf models`, `hf datasets`, `hf spaces`) and introduces real-time job monitoring via `hf jobs stats`. A breaking change removes the unused `direction` parameter from repository listing methods.
v1.2.4 (3 fixes, 1 feature): This release fixes several bugs, including an issue with create_repo and job-related API endpoints, and adds the @dataclass_transform decorator to dataclass_with_extra.
v1.2.3 (1 fix): This patch release fixes a bug in the CLI where new repository creation incorrectly defaulted to private=False instead of private=None, aligning CLI behavior with the API.
v1.2.2 (2 fixes): This patch release addresses two specific bugs: one related to reading corrupted metadata files and another concerning the display of the HF_TOKEN message in the authentication list.
v1.2.0 (Breaking, 6 fixes, 8 features): This release significantly enhances rate limit handling with automatic retries and clearer errors, introduces the daily papers endpoint, and adds OVHcloud as a new Inference Provider. It also includes a breaking change where several access request listing functions now return iterators instead of lists.
v1.1.7 (1 feature): This release introduces a convenient top-level import for HfFileSystem, now accessible as `huggingface_hub.hffs`.
v1.1.6 (4 fixes): This release focuses on stability, delivering several bug fixes across repository downloading, error handling, and CLI functionality.
v1.1.5 (3 fixes, 2 features): This release introduces OVHcloud AI Endpoints as a new Inference Provider and significantly speeds up CLI installation by integrating `uv` support. Several minor bug fixes address issues in collections, CLI debugging, and inference parsing.
v1.1.4 (Breaking, 1 feature): This release introduces pagination for `list_user_access` results, a necessary breaking change to align with the updated server API.
v1.1.3 (1 fix, 1 feature): This patch release introduces an optional `name` parameter for catalog deployment and resolves rate-limiting issues encountered during large dataset downloads.
v1.1.0 (2 fixes, 8 features): This release focuses on optimizing the file download experience through multi-threading and cleaner CLI output, while significantly expanding CLI capabilities with new commands for managing Inference Endpoints and verifying cache integrity. New support for WaveSpeedAI and image segmentation on fal are also introduced.
v1.0.1 (Breaking, 1 fix): This patch release removes the lingering dependency on `aiohttp` from the `huggingface_hub[inference]` extra and cleans up an unused internal method.
v1.0.0 (Breaking, 2 fixes, 8 features): Version 1.0 introduces a major migration to httpx for HTTP requests, dropping the requests and aiohttp dependencies, and completely revamps the CLI experience, replacing `huggingface-cli` with the new `hf` command.
v0.36.0 (5 fixes, 3 features): This release focuses on significant performance optimizations for HfFileSystem and introduces the new get_organization_overview API endpoint, serving as the final minor release before v1.0.0.
v0.35.3 (2 fixes): This release addresses two specific bugs related to fal-ai image-to-image inference and Tiny-Agents tool usage.
v0.35.2 (1 fix, 2 features): This release introduces Z.ai as a new official Inference Provider on the Hub and includes an optimization for file system glob operations.
v0.35.1 (2 fixes): This patch release adjusts the retry logic to only target 5xx errors and fixes an issue with unresolved forward references in strict dataclasses.
v0.35.0 (12 fixes, 8 features): This release introduces Scheduled Jobs for running compute tasks on a regular basis via a new CLI interface, alongside significant updates to the Inference Client supporting Image-to-Video and new providers. Several bug fixes and maintenance updates were also applied across the CLI and Jobs API.
v0.34.6 (2 features): This release introduces PublicAI as a new supported inference provider, allowing users to interact with models hosted by this nonprofit organization via InferenceClient.
v0.34.5 (1 feature): This release introduces Scaleway as a new supported inference provider, allowing users to access models hosted on their infrastructure via the InferenceClient.
v0.34.4 (1 fix, 7 features): This release introduces support for the Image-To-Video inference task using Fal AI and includes several quality-of-life improvements across job handling, environment dumping, and repository uploads.
v0.34.3 (1 feature): This release updates the uv image for Jobs and improves the whoami command output by adding a 'user:' prefix.
v0.34.2 (2 fixes): This patch release addresses two specific bugs: one related to path extension on Windows and another concerning incorrect total size reporting during download resumption.
v0.34.1 (2 fixes): This release primarily addresses stability issues in file downloading utilities, fixing potential failures with private tokens and race conditions during concurrent downloads.
v0.34.0 (12 fixes, 13 features): This release introduces 'Jobs', a powerful new CLI and Python API for running compute tasks on Hugging Face infrastructure using Docker-like commands. Additionally, the Hugging Face CLI has been officially renamed from `huggingface-cli` to `hf`, and the InferenceClient gained image-to-image support.
v0.33.5 (1 fix): This patch release addresses a specific UserWarning related to open sessions when using AsyncInferenceClient for streaming operations.
v0.33.4 (1 feature): This release updates tiny-agent to omit parameters in its default tools. More details are available in the linked pull request.
v0.33.3: This release primarily updates the tiny-agents example and provides a link to the full comparison between versions.
v0.33.2 (Breaking, 1 fix, 1 feature): This release updates the Tiny-Agent integration to use the VSCode MCP format, requiring significant changes to configuration structure, including flattening nested mappings and moving request headers to the root level.
v0.33.1 (3 fixes, 1 feature): This patch release focuses on bug fixes related to payload preparation and inference endpoint health checks, alongside adding tool call support to Tiny agents messages.
v0.33.0 (11 fixes, 4 features): This release introduces two major new inference providers, Featherless.AI and Groq, significantly enhancing model accessibility and inference speed. It also brings several quality-of-life improvements, bug fixes across various components, and advancements for local agent execution via MCP and tiny-agents.
v0.32.6 (1 fix): This patch release fixes a bug related to the incorrect saving of the upload_mode and remote_oid parameters.
v0.32.5 (2 features): This minor release introduces the ability to inject environment variables into headers and enhances the codebase with better type annotations.
v0.32.4 (3 fixes): This release focuses on bug fixes, specifically addressing issues in `asyncio` usage, token yielding, and the `InferenceClient.question_answering` method.
v0.32.3 (2 fixes, 1 feature): This release focuses on improvements and bug fixes for the tiny-agents feature, including better environment variable handling and CLI stability.
v0.32.2 (1 fix, 1 feature): This release introduces local/remote endpoint inference support and resolves a critical bug affecting snapshot downloads for extremely large repositories.
v0.32.1 (1 fix): This is a patch release focused on fixing a specific reported issue (#3116).
v0.32.0 (1 fix, 7 features): This release introduces powerful new capabilities for LLM interaction via the Model Context Protocol (MCP) Client and Tiny Agents CLI, alongside support for new inference providers and enhanced dataclass validation with the @strict decorator.
v0.31.4 (1 fix, 2 features): This release introduces new `strict` decorators for dataclass validation and adds DTensor support to storage size helpers, alongside several bug fixes.
v0.31.2 (1 feature): This patch release makes the `hf-xet` dependency optional, improving installation flexibility. More context is available in the related pull requests.
v0.31.0 (Breaking, 8 fixes, 9 features): This release introduces major enhancements to Inference Providers, adding support for LoRA inference via fal.ai and Replicate, and enabling 'auto' provider selection as the new default. Additionally, Xet uploads now support byte arrays, and large file downloads (>50GB) are more reliable.
v0.30.2 (2 fixes): This patch release addresses several bugs related to the InferenceClient, specifically fixing text-generation with external providers and conversational handling in HfInference.
v0.30.1 (1 fix): This patch release addresses a specific bug reported in the Hugging Face Hub repository.
v0.30.0 (Breaking, 4 fixes, 12 features): This major release introduces Xet, a new protocol for storing large objects in Git, and significantly enhances the InferenceClient with new providers (Cerebras, Cohere, Novita) and organizational billing support. New features also include wildcard support for CLI uploads and programmatic LFS file management.
v0.29.3 (2 features): This release introduces client-side support for the Cerebras and Cohere providers in preparation for their official launch on the Hub.
v0.29.2 (2 fixes): This patch release addresses two specific bugs concerning payload model naming when using URLs and ensuring sys.stdout is restored after errors in notebook_login.
v0.29.1 (2 fixes): This patch release addresses two specific bugs related to large folder uploads and the inference endpoint waiting mechanism.
v0.29.0 (9 fixes, 6 features): This release significantly expands serverless inference capabilities by adding official support for Fireworks AI, Hyperbolic, Nebius AI Studio, and Novita providers. It also includes several quality-of-life improvements and deprecates some legacy hf-inference specific features.
v0.28.1 (1 fix): This patch release fixes a critical bug introduced in v0.28.0 related to setting the HF_ENDPOINT environment variable with subpaths.
v0.28.0 (Breaking, 10 fixes, 5 features): This release introduces major updates to the InferenceClient, enabling unified inference across multiple third-party providers using Hugging Face Hub model IDs. Additionally, HfApi has been updated with new repository properties and the deprecated `like` endpoint has been removed.
Common Errors
LocalEntryNotFoundError (4 reports): A LocalEntryNotFoundError in huggingface-hub often means a requested file isn't cached locally. Ensure you have internet connectivity to download the file. If the file should exist, check that you're using the correct `repo_id` and `filename`, and consider clearing your cache with `huggingface-cli delete-cache` if necessary.
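A minimal sketch of catching the error explicitly so offline runs degrade gracefully; `gpt2` and `config.json` are example identifiers, not required values:

```python
from huggingface_hub import hf_hub_download
from huggingface_hub.errors import LocalEntryNotFoundError

# "gpt2" / "config.json" are illustrative; substitute your own repo and file.
try:
    path = hf_hub_download(repo_id="gpt2", filename="config.json")
except LocalEntryNotFoundError:
    # Raised when the file is neither cached locally nor reachable online.
    path = None
```

When the error is caught, falling back to a bundled default or surfacing a clear "connect to the internet" message is usually friendlier than letting the traceback escape.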
FileNotFoundError (4 reports): A FileNotFoundError in huggingface-hub often arises from temporary network issues during file downloads or corrupted local cache files. Clearing your local cache (with `huggingface-cli delete-cache`) and retrying the download is the primary solution. Implement retry mechanisms in your code to automatically handle intermittent network errors, especially when downloading large files.
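Before clearing anything, it can help to inspect what is actually cached. A sketch using `scan_cache_dir` (the exception's import location may vary between library versions):

```python
from huggingface_hub import scan_cache_dir
from huggingface_hub.errors import CacheNotFound

try:
    cache_info = scan_cache_dir()
    # Each repo entry exposes its revisions and on-disk size for review.
    repo_count = len(cache_info.repos)
except CacheNotFound:
    # No cache directory exists yet on this machine.
    repo_count = 0
```

The object returned by `scan_cache_dir` also supports building a deletion strategy for specific revisions, which is more surgical than wiping the whole cache.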
BadRequestError (3 reports): A BadRequestError in huggingface_hub often arises when the API receives an improperly formatted request, such as incorrect data types or missing required fields in the input. Inspect your request payload (e.g., the `inputs` parameter) to ensure it adheres to the API's expected schema, typically a dictionary with specific key-value pairs. Double-check your code for instances where None or unexpected objects are being passed as input and correct them according to the API documentation before resubmitting the request.
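A hypothetical pre-flight check can catch malformed payloads before they ever reach the API; the helper name and accepted types below are illustrative, not part of the library:

```python
def validate_payload(inputs):
    """Illustrative guard for payload shapes that commonly trigger BadRequestError."""
    if inputs is None:
        raise ValueError("`inputs` must not be None")
    if not isinstance(inputs, (str, dict, list)):
        raise TypeError(f"unsupported inputs type: {type(inputs).__name__}")
    return inputs

# Example: a question-answering style payload as a plain dict.
payload = validate_payload({"question": "What is 2+2?", "context": "2+2 is 4."})
```

Failing fast locally turns an opaque server-side 400 into a readable exception that points at the offending value.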
ReadTimeoutError (3 reports): A ReadTimeoutError in huggingface-hub usually arises from network instability or server-side delays when downloading or accessing large files or repositories. To fix this, set the `HF_HUB_DOWNLOAD_TIMEOUT` environment variable to a higher value (in seconds) to allow more time for the download to complete. Alternatively, increase the relevant timeout argument directly in code, e.g. `etag_timeout` in `hf_hub_download`, to a value appropriate for your network conditions.
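A sketch of raising the timeout via the environment; the value must be set before `huggingface_hub` is imported for it to take effect, and 60 seconds is an arbitrary example, not a recommendation:

```python
import os

# Example value; tune to your network. Set before importing huggingface_hub,
# since the library reads this variable at import time.
os.environ["HF_HUB_DOWNLOAD_TIMEOUT"] = "60"

from huggingface_hub import hf_hub_download  # imported after setting the variable
```

In CI, exporting the variable in the job configuration is usually cleaner than setting it in code.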
HTTPError (3 reports): An HTTPError in huggingface-hub usually indicates a problem on the Hugging Face Hub's server side (e.g., server overload or maintenance). Your best course of action is to retry the operation after a short delay (e.g., using exponential backoff), as the server might recover quickly. If the issue persists for an extended period, check the Hugging Face Hub's status page or community forums for updates, or report the issue.
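Exponential backoff can be sketched generically; the operation is injected as a callable so the same pattern wraps any download or API call (names below are illustrative):

```python
import time

def with_retries(operation, max_retries=3, base_delay=1.0, retriable=(Exception,)):
    """Retry `operation` with exponential backoff on transient errors."""
    for attempt in range(max_retries):
        try:
            return operation()
        except retriable:
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            # Delays grow as base_delay, 2*base_delay, 4*base_delay, ...
            time.sleep(base_delay * (2 ** attempt))
```

In practice you would narrow `retriable` to the specific exception types worth retrying (e.g. the library's HTTP error class) rather than catching everything.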
RepositoryNotFoundError (3 reports): A RepositoryNotFoundError in huggingface_hub usually indicates that the repository you're trying to access (model, dataset, or Space) does not exist at the specified location, or that you lack the necessary permissions. To fix this, double-check the repository name and its owner/organization, verify that the repository actually exists on the Hub, and ensure you have a valid authentication token if it's a private repository. If the problem persists, make sure your machine (such as a CI/CD runner) is authorized to access the repository by logging in with `huggingface-cli login` or setting the `HF_TOKEN` environment variable.
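A minimal sketch of explicit token handling; `HfApi` also picks up `HF_TOKEN` from the environment automatically, so passing it explicitly is mainly for clarity in scripts:

```python
import os
from huggingface_hub import HfApi

# Reads the token from the environment; None means anonymous access,
# which will fail with RepositoryNotFoundError on private repos.
api = HfApi(token=os.environ.get("HF_TOKEN"))
```

With the client constructed this way, a call like `api.repo_info(...)` against a private repository makes the missing-token case reproducible and easy to diagnose.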