Changelog

v0.15.5-rc0

📦 ollama

Summary

This release introduces the GLM-OCR model and updates default context sizes based on VRAM availability. It also adds support for GLM-4.7-Flash on the MLX engine.

✨ New Features

  • Introduced GLM-OCR, a multimodal OCR model for complex document understanding.
  • Added GLM-4.7-Flash support when using Ollama's experimental MLX engine.
  • Updated the default context size based on available VRAM: 4096 for less than 24 GiB, 32768 for 24–48 GiB, and 262144 for 48 GiB or more.
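The VRAM-based tiering above can be sketched as a simple threshold check. This is an illustrative sketch, not Ollama's actual implementation; the function name `defaultContextSize` and its signature are assumptions.

```go
package main

import "fmt"

// defaultContextSize returns the default context window implied by the
// tiers in these release notes, given available VRAM in GiB.
// Hypothetical helper for illustration only.
func defaultContextSize(vramGiB float64) int {
	switch {
	case vramGiB >= 48: // 48 GiB or more
		return 262144
	case vramGiB >= 24: // 24–48 GiB
		return 32768
	default: // less than 24 GiB
		return 4096
	}
}

func main() {
	for _, v := range []float64{8, 24, 48} {
		fmt.Printf("%.0f GiB -> %d tokens\n", v, defaultContextSize(v))
	}
}
```

Boundary values fall into the higher tier: exactly 24 GiB selects 32768, and exactly 48 GiB selects 262144.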