Change8

v5.0.0

📦 sentence-transformers
✨ 8 features · 🐛 1 fix · 🔧 10 symbols

Summary

Sentence-Transformers 5.0.0 adds SparseEncoder support for sparse embedding models, new encode_query/encode_document methods, multiprocessing support in encode, a Router module for asymmetric architectures, custom learning rates for parameter groups, and composite loss logging, while remaining backwards compatible.

Migration Steps

  1. Read the Migration Guide at https://sbert.net/docs/migration_guide.html for recommended changes.
  2. Upgrade the package using pip install sentence-transformers==5.0.0 (or appropriate extras).
  3. Replace any custom encoding logic with the new encode_query / encode_document methods if needed (see the sketch after this list).
  4. If using custom training loops, adjust parameter group definitions to use the new custom learning rate support.
  5. Test your code after upgrading; open an issue if you encounter problems.
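
As a reference for step 3, here is a minimal sketch of the new encoding path. The model name and texts are illustrative; per the release notes, encode_query and encode_document apply any query/document-specific prompts or routing configured on the model before encoding.

    from sentence_transformers import SentenceTransformer

    # Illustrative model name; substitute the model you already use.
    model = SentenceTransformer("all-MiniLM-L6-v2")

    queries = ["What is sparse retrieval?"]
    documents = [
        "Sparse retrieval represents text as high-dimensional sparse vectors.",
        "Dense retrieval maps text to low-dimensional dense vectors.",
    ]

    # The dedicated methods replace manual prompt handling around encode().
    query_embeddings = model.encode_query(queries)
    document_embeddings = model.encode_document(documents)

    # Score every query against every document with the model's similarity function.
    scores = model.similarity(query_embeddings, document_embeddings)
    print(scores)  # one row per query, one column per document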

✨ New Features

  • Introduced the SparseEncoder class for sparse embedding models (a usage sketch follows this list).
  • Added encode_query and encode_document methods for query/document-specific encoding.
  • Enabled multi-processing support in the encode method.
  • Added Router module for asymmetric model architectures.
  • Supported custom learning rates for parameter groups during training.
  • Implemented composite loss logging.
  • Added decode, intersection, and sparsity utilities for sparse embeddings.
  • Various small improvements and bug fixes.
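
The sketch below strings several of these pieces together for a sparse model. The checkpoint name is illustrative, and the decode and sparsity calls are assumptions about how the listed utilities are exposed on SparseEncoder; check the v5.0.0 documentation for the exact signatures.

    from sentence_transformers import SparseEncoder

    # Illustrative SPLADE-style checkpoint; any sparse embedding model should work.
    model = SparseEncoder("naver/splade-cocondenser-ensembledistil")

    query_embeddings = model.encode_query(["What is sparse retrieval?"])
    document_embeddings = model.encode_document(
        ["Sparse retrieval represents text as high-dimensional sparse vectors."]
    )

    # Score query-document pairs with the model's similarity function.
    print(model.similarity(query_embeddings, document_embeddings))

    # Inspect which tokens carry weight in the sparse query embedding (assumed signature).
    print(model.decode(query_embeddings[0], top_k=10))

    # Report how sparse the embeddings are (assumed signature).
    print(model.sparsity(query_embeddings))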

🐛 Bug Fixes

  • Various small bug fixes and improvements.

🔧 Affected Symbols

  • SparseEncoder
  • encode_query
  • encode_document
  • encode
  • Router
  • custom learning rate parameter groups
  • composite loss logging
  • decode
  • intersection
  • sparsity