v4.56.1-Vault-Gemma-preview
📦 transformers
✨ 3 features · 🔧 3 symbols
Summary
This release introduces a preview of VaultGemma, a 1B-parameter, text-only decoder model trained with sequence-level differential privacy.
Migration Steps
- Install the preview version: `pip install git+https://github.com/huggingface/transformers@v4.56.1-Vault-Gemma-preview`
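After installing, a quick check such as the following confirms that the preview build is the one being imported. This is a minimal sketch; the exact version string reported by the preview branch is an assumption.

```python
# Minimal sketch: confirm the preview build of transformers is active.
# The exact version string reported by the preview branch is an assumption;
# it should correspond to a 4.56.x development/preview version.
import transformers

print(transformers.__version__)
```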
✨ New Features
- Added support for VaultGemma, a text-only decoder model derived from Gemma 2.
- VaultGemma is a 1B-parameter model trained with sequence-level differential privacy (DP).
- Architectural changes relative to Gemma 2: the norms after the attention and MLP blocks are removed, and full attention is used in all layers (a usage sketch follows this list).
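The sketch below shows one way to load the model through the affected symbols (AutoModelForCausalLM and pipeline). The checkpoint id `google/vaultgemma-1b` is an assumption used for illustration; substitute the checkpoint published with this preview.

```python
# Minimal sketch of loading VaultGemma via AutoModelForCausalLM and pipeline.
# The checkpoint id below is an assumption; replace it with the checkpoint
# released alongside this preview.
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "google/vaultgemma-1b"  # assumed checkpoint id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
print(generator("Differential privacy protects", max_new_tokens=32)[0]["generated_text"])
```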
🔧 Affected Symbols
- VaultGemma
- AutoModelForCausalLM
- pipeline