huggingface-hub v0.30.0
⚠ 1 breaking · ✨ 12 features · 🐛 4 fixes · ⚡ 1 deprecation · 🔧 10 symbols
Summary
This release introduces Xet, a new protocol for storing large files in Git repositories that is designed to replace Git LFS, and significantly enhances the `InferenceClient` with new providers (Cerebras, Cohere, Novita) and organization-level billing. It also adds wildcard support for CLI uploads and programmatic LFS file management.
⚠️ Breaking Changes
- The argument `labels` has been removed from `InferenceClient.zero_shot_classification` and `InferenceClient.zero_shot_image_classification` tasks. Use `candidate_labels` instead.
Migration Steps
- Install the optional dependency to enable Xet support: `pip install -U huggingface_hub[hf_xet]`.
- When using `InferenceClient.zero_shot_classification` or `InferenceClient.zero_shot_image_classification`, replace usage of the `labels` argument with `candidate_labels`.
- If you were explicitly setting `token=False` in `InferenceClient`, remove that argument.
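For the renamed argument, the migration can be sketched as a small helper that rewrites old-style keyword arguments before calling `zero_shot_classification` or `zero_shot_image_classification`. The helper name is hypothetical; only the `labels` → `candidate_labels` rename comes from the release notes.

```python
def migrate_zero_shot_kwargs(kwargs):
    """Hypothetical helper: rename the removed `labels` keyword argument
    to `candidate_labels` for zero-shot classification calls."""
    kwargs = dict(kwargs)  # don't mutate the caller's dict
    if "labels" in kwargs and "candidate_labels" not in kwargs:
        kwargs["candidate_labels"] = kwargs.pop("labels")
    return kwargs

# Old call site: client.zero_shot_classification(text, labels=["positive", "negative"])
# New call site: client.zero_shot_classification(text, **migrate_zero_shot_kwargs({"labels": ["positive", "negative"]}))
```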
✨ New Features
- Introduction of Xet protocol support for storing large objects in Git repositories, designed to replace Git LFS.
- Ability to download files from Xet-enabled repositories via the optional dependency `huggingface_hub[hf_xet]`.
- Cerebras and Cohere added as official inference providers to `InferenceClient`.
- Novita added as a text-to-video inference provider supporting asynchronous calls.
- Support for centralizing billing on organizations for `InferenceClient` using the `bill_to` parameter (requires Enterprise Hub subscription).
- Asynchronous calls introduced for text-to-video inference in `InferenceClient` to handle long-running tasks.
- Support for passing a path with a wildcard to `huggingface-cli upload` (e.g., `huggingface-cli upload my-cool-model *.safetensors`).
- Support for deploying an Inference Endpoint directly from the Model Catalog using `create_inference_endpoint_from_catalog`.
- ModelHubMixin update: authors can now provide a paper URL to be added to all pushed model cards.
- ModelHubMixin update: dataclasses are now supported for any init argument (previously only `config`).
- Added `--sort` argument to `huggingface-cli delete-cache` to allow sorting by name, size, last updated, or last used.
- Support for programmatically listing and permanently deleting LFS files from a repository via `HfApi.list_lfs_files` and `HfApi.permanently_delete_lfs_files`.
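Several of the features above are keyword arguments on `InferenceClient`. As a rough sketch of how a client might be configured with a named provider and organization billing, under the assumptions that the helper below is purely illustrative, the organization name `my-org` is a placeholder, and `bill_to` requires an Enterprise Hub subscription:

```python
def build_client_kwargs(provider, bill_to=None):
    """Illustrative helper: assemble keyword arguments for InferenceClient.

    `provider` selects an inference provider (e.g. "cerebras", "cohere",
    "novita"); `bill_to`, when set, routes usage charges to an organization.
    """
    kwargs = {"provider": provider}
    if bill_to is not None:
        kwargs["bill_to"] = bill_to
    return kwargs

# client = InferenceClient(**build_client_kwargs("cerebras", bill_to="my-org"))
# (left commented: the real call needs network access and credentials)
```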
🐛 Bug Fixes
- Fixed a revision bug in `_upload_large_folder.py`.
- Fixed the Inference Endpoint wait helper so it correctly waits while an endpoint is being updated.
- Updated `SpaceHardware` enum.
- Restored `sys.stdout`.
⚡ Deprecations
- Passing `token=False` to `InferenceClient` is deprecated; remove the argument instead.
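A minimal sketch of the new behavior, assuming a hypothetical shim (not part of the library) that maps the deprecated `token=False` to "no token" with a warning so call sites can be cleaned up gradually:

```python
import warnings

def normalize_token(token):
    """Hypothetical shim for the deprecated token=False pattern:
    map False to None (anonymous requests) and emit a warning,
    pass any other value through unchanged."""
    if token is False:
        warnings.warn(
            "Passing token=False is deprecated; omit the argument instead.",
            FutureWarning,
        )
        return None
    return token
```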