
v0.14.0-rc7

📦 ollama
✨ 5 features · 🐛 4 fixes · 🔧 1 affected symbol

Summary

This release introduces experimental support for image generation models via MLX, adds Anthropic API compatibility, and includes several stability improvements related to VRAM handling and error reporting.

✨ New Features

  • CLI command `ollama run --experimental` now opens a new Ollama CLI featuring an agent loop and the `bash` tool.
  • Added support for Anthropic API's `/v1/messages` endpoint.
  • Introduced a new `REQUIRES` command in `Modelfile` to specify the Ollama version a model requires.
  • Ollama's app now highlights Swift source code.
  • Added experimental support for image generation models powered by MLX.
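The Anthropic-compatible endpoint accepts the same request shape as Anthropic's Messages API. A minimal sketch of constructing such a request against a local Ollama server (the server address, model name, and prompt are assumptions, not part of the release notes):

```python
import json

# Build a request body in the Anthropic Messages API shape.
# Ollama's default listen address is localhost:11434; the model
# name here is just an example of a locally pulled model.
url = "http://localhost:11434/v1/messages"
payload = {
    "model": "llama3.2",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Why is the sky blue?"}
    ],
}
body = json.dumps(payload)
# POST `body` to `url` with any HTTP client, e.g.:
#   curl http://localhost:11434/v1/messages -H 'content-type: application/json' -d "$body"
```

Existing clients written against Anthropic's Messages API should be able to point their base URL at the Ollama server without changing the request format.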

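A `Modelfile` using the new `REQUIRES` command might look like the following (a minimal sketch; the exact version-string syntax is an assumption based on the release notes):

```
# Hypothetical Modelfile using the new REQUIRES command
FROM llama3.2
REQUIRES 0.14.0
```

A model built from such a `Modelfile` would declare that it needs Ollama 0.14.0, letting older versions report an incompatibility instead of failing in less obvious ways.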
🐛 Bug Fixes

  • Prevented integer underflow on low VRAM systems during memory estimation for older models.
  • Improved VRAM measurement accuracy for AMD iGPUs.
  • Embedding requests now return an error if the result contains `NaN` or `-Inf` values.
  • Ollama's Linux install bundles now use `zstd` compression.

Affected Symbols