Error · 4 reports
Fix OutOfMemoryError in sentence-transformers
✅ Solution
OutOfMemoryError in sentence-transformers usually arises from loading a model or batch that exceeds available GPU memory. Reduce the batch size during training or inference, and consider switching to a smaller model such as `all-MiniLM-L6-v2`, which has a much smaller memory footprint. Alternatively, enable gradient accumulation to keep the effective batch size while lowering peak memory, or offload model weights to the CPU during training if your setup allows it.
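As a concrete illustration, here is a minimal sketch of these mitigations using the sentence-transformers API. It assumes a v3+ installation (for the Trainer-style training arguments); the model name, output directory, and batch sizes are placeholders to adjust for your hardware.

```python
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainingArguments

# Prefer a small model: all-MiniLM-L6-v2 (~22M parameters) needs far less
# GPU memory than large embedding models.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Inference: lower the encoding batch size (default is 32) so fewer
# activations live on the GPU at once.
sentences = ["first example sentence", "second example sentence"]
embeddings = model.encode(sentences, batch_size=8)

# Training: shrink the per-device batch and use gradient accumulation to
# preserve the effective batch size while reducing peak memory; fp16
# roughly halves activation memory on most GPUs.
args = SentenceTransformerTrainingArguments(
    output_dir="output/minilm-finetune",  # placeholder path
    per_device_train_batch_size=8,
    gradient_accumulation_steps=4,        # effective batch size of 32
    fp16=True,
)
```

If memory is still tight after these changes, encoding on CPU (`device="cpu"`) or truncating very long inputs via the model's `max_seq_length` can further reduce the footprint, at the cost of speed or context length.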
Related Issues
Real GitHub issues where developers encountered this error:
GPU Memory Issues when training sentence-transformer model on CoIR dataset (May 10, 2025)
HPO torch.OutOfMemoryError despite large GPU VRAM (model seems to load model 3 times to gpu) (Feb 4, 2025)
Hitting OOM in model loading in multi GPU setting (Jan 23, 2025)
SentenceTransformerTrainer consuming unexpectedly large amount of memory (Jan 23, 2025)
Timeline
First reported: Jan 23, 2025
Last reported: May 10, 2025
Need More Help?
View the full changelog and migration guides for sentence-transformers
View sentence-transformers Changelog