v0.17.0
📦 peft
✨ 4 features · 🐛 10 fixes · ⚡ 1 deprecation · 🔧 6 symbols
Summary
This release introduces two major new PEFT methods, SHiRA and MiSS (which deprecates Bone), and significantly enhances LoRA by enabling direct targeting of nn.Parameter, which is crucial for MoE layers. It also adds a utility for injecting adapters directly from a state_dict.
Migration Steps
- If you are using the Bone method, convert your checkpoints to MiSS using [`scripts/convert-bone-to-miss.py`](https://github.com/huggingface/peft/tree/main/scripts/convert-bone-to-miss.py) to prepare for PEFT v0.19.0.
- When targeting MoE weights with LoRA, use the `target_parameters` attribute in `LoraConfig`, setting `target_modules` to `[]` if no regular modules should be targeted, e.g., `target_parameters=["feed_forward.experts.down_proj", "feed_forward.experts.gate_up_proj"]`.
- If you need to inject an adapter from a state_dict without knowing the full config, use `inject_adapter_in_model(lora_config, model, state_dict=state_dict)`.
✨ New Features
- Introduced SHiRA (Sparse High Rank Adapters), a new method that can offer gains over LoRA, especially with respect to concept loss when using multiple adapters.
- Introduced MiSS (Matrix Shard Sharing), an evolution of Bone that shows excellent performance and memory efficiency.
- LoRA can now target nn.Parameter directly using the `target_parameters` config attribute, useful for MoE layers.
- New capability to inject PEFT layers using `inject_adapter_in_model` by passing the loaded state_dict, avoiding the need to manually specify PEFT config details like `target_modules`.
🐛 Bug Fixes
- Corrected a bug in prompt learning methods where `modules_to_save` was ignored, causing classification layers to be neither trained nor stored.
- Fixed breakage caused by the mask function signature change in transformers 4.53.1.
- Fixed faulty OFT parameter device test.
- Fixed an error in the workflow file to deploy the method comparison app.
- Fixed Prefix tuning behavior after transformers PR 38635.
- Fixed a failing parameter usage count when `target_parameters` is used.
- Fixed trainable tokens issue when using FSDP.
- Fixed small issues related to `target_parameters` usage.
- Fixed missing device map for facebook/opt-125m.
- Fixed issue where regex-targeted embedding layer was not being detected.
🔧 Affected Symbols
- `nn.Parameter`
- `LoraConfig`
- `inject_adapter_in_model`
- `Bone`
- `MiSS`
- prompt learning methods

⚡ Deprecations
- Bone method is deprecated in favor of MiSS and will be removed in PEFT v0.19.0.