Changelog

v0.14.0

📦 ollama
✨ 5 features · 🐛 4 fixes · 🔧 2 symbols

Summary

This release introduces experimental support for image generation models and enhances API compatibility with Anthropic's message format. It also includes stability improvements for VRAM estimation and adds the `REQUIRES` command for Modelfiles.

Migration Steps

  1. If you rely on specific memory estimation behavior for older models on low VRAM systems, note that the estimation logic has been adjusted to avoid underflow.
  2. Users defining models that require a specific Ollama version should use the new `REQUIRES` command in their `Modelfile`.
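A `Modelfile` using the new command might look like the sketch below. The base model name and the exact version-constraint syntax shown here are assumptions for illustration; consult the Modelfile reference for the precise format.

```
# Hypothetical Modelfile — base model and constraint syntax are assumed
FROM llama3.2
REQUIRES >=0.14.0
```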

✨ New Features

  • The `ollama run --experimental` command now launches a new interactive CLI featuring an agent loop and the `bash` tool.
  • Added support for Anthropic's `/v1/messages` API.
  • Introduced a new `REQUIRES` command for `Modelfile` to declare required Ollama version.
  • Ollama's app now highlights Swift source code.
  • Added experimental support for image generation models powered by MLX.
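As a reference point for the new `/v1/messages` support, a request body follows Anthropic's Messages API shape. A minimal Python sketch, assuming Ollama's default address `http://localhost:11434` and an illustrative model name (both assumptions, not from the release notes):

```python
import json

# Endpoint path comes from the release notes; the host, port, and
# model name below are assumptions for illustration.
OLLAMA_URL = "http://localhost:11434/v1/messages"

payload = {
    "model": "llama3.2",   # any locally pulled model
    "max_tokens": 256,     # required field in the Anthropic format
    "messages": [
        {"role": "user", "content": "Why is the sky blue?"}
    ],
}

body = json.dumps(payload)

# To actually send it (requires a running Ollama server):
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL,
#     data=body.encode(),
#     headers={"content-type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```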

🐛 Bug Fixes

  • Prevented an integer underflow during memory estimation on low VRAM systems for older models.
  • Improved VRAM measurement accuracy for AMD iGPUs.
  • Embeddings that produce `NaN` or `-Inf` values now return an error instead of invalid results.
  • Ollama's Linux install bundles now use `zstd` compression.
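The underflow fix above concerns unsigned arithmetic: when the estimated memory requirement exceeds the available VRAM, a naive unsigned subtraction wraps around to an enormous value rather than going negative. The sketch below simulates the failure mode and a saturating fix in Python (illustrative only, not Ollama's actual code):

```python
MASK64 = (1 << 64) - 1  # simulate 64-bit unsigned wraparound in Python


def wrapping_sub(a: int, b: int) -> int:
    """Naive unsigned subtraction: underflows (wraps) when b > a."""
    return (a - b) & MASK64


def saturating_sub(a: int, b: int) -> int:
    """Clamp at zero instead of wrapping — the spirit of the fix."""
    return a - b if a >= b else 0


free_vram = 2 * 1024**3  # 2 GiB free on the device
needed = 3 * 1024**3     # model is estimated to need 3 GiB

buggy = wrapping_sub(free_vram, needed)    # absurdly large "remaining" VRAM
fixed = saturating_sub(free_vram, needed)  # 0: correctly signals a shortfall
```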

Affected Symbols