v4.53.2-Ernie-4.5-preview
📦 transformers
✨ 2 features · 🔧 3 symbols
Summary
This preview release introduces Baidu's Ernie 4.5 model family to Transformers, including a 0.3B dense model and MoE variants (21B and 300B).
Migration Steps
- Install the preview version using: pip install git+https://github.com/huggingface/transformers@v4.53.2-Ernie-4.5-preview
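After installing, a quick smoke test such as the one below can confirm the preview build is active and that the new architecture resolves through the Auto classes. This is a minimal sketch: the hub checkpoint ID baidu/ERNIE-4.5-0.3B-PT is an assumption based on Baidu's published checkpoints and may differ.

```python
# Minimal smoke test: check the installed version and that the new
# Ernie 4.5 architecture resolves through AutoConfig.
# The hub checkpoint ID below is an assumption and may differ.
import transformers
from transformers import AutoConfig

print(transformers.__version__)  # should report the 4.53.2 preview build

config = AutoConfig.from_pretrained("baidu/ERNIE-4.5-0.3B-PT")
print(config.model_type)  # expected to identify the Ernie 4.5 architecture
```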
✨ New Features
- Added support for the Ernie 4.5 dense model (0.3B parameters), based on the Llama architecture.
- Added support for the Ernie 4.5 MoE models (21B and 300B variants), which use a Mixtral-based architecture with additional shared experts; both load through the standard Auto classes, as shown in the sketch below.
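The sketch below shows text generation with the 0.3B dense checkpoint via the standard Auto classes. The hub ID and generation settings are illustrative assumptions; the MoE variants load the same way but require substantially more memory.

```python
# Minimal generation sketch for the 0.3B dense variant.
# "baidu/ERNIE-4.5-0.3B-PT" is an assumed checkpoint ID; substitute the
# checkpoint you actually intend to use. The MoE variants load the same
# way but need far more memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "baidu/ERNIE-4.5-0.3B-PT"  # assumed hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

inputs = tokenizer("Hello, Ernie 4.5!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```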
🔧 Affected Symbols
- AutoModelForCausalLM
- AutoTokenizer
- Ernie45ForCausalLM