
v0.14.0-rc10

📦 ollama
✨ 5 features · 🐛 3 fixes · 🔧 2 symbols

Summary

This release introduces experimental support for image generation models via MLX and enhances Anthropic API compatibility. It also adds the `REQUIRES` command to Modelfiles for version declaration and improves VRAM estimation accuracy.

Migration Steps

  1. If you define models using `Modelfile`, consider using the new `REQUIRES` command to specify the minimum required Ollama version.
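For instance, a Modelfile could pin the minimum Ollama version alongside its base model. This is a minimal sketch: the exact argument format accepted by `REQUIRES` is an assumption, and `llama3.2` is a placeholder base model.

```
# Base model (placeholder)
FROM llama3.2

# Declare the minimum Ollama version this model needs (assumed syntax)
REQUIRES 0.14.0
```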

✨ New Features

  • Added experimental support for image generation models powered by MLX.
  • Running `ollama run --experimental` now launches a new experimental Ollama CLI featuring an agent loop and a `bash` tool.
  • Added Anthropic API compatibility via the `/v1/messages` endpoint.
  • Introduced a new `REQUIRES` command in `Modelfile` to declare the required Ollama version for a model.
  • Ollama's app now highlights Swift source code.
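The Anthropic-compatible endpoint lets existing Messages API clients talk to a local Ollama server. A minimal sketch in Python, assuming the default `localhost:11434` address and a placeholder model name; the payload shape follows Anthropic's Messages API, and field support may vary.

```python
import json
import urllib.request

# Anthropic Messages API-style payload (model name is a placeholder)
payload = {
    "model": "llama3.2",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Why is the sky blue?"}
    ],
}

# Ollama's Anthropic-compatible endpoint (default local address assumed)
url = "http://localhost:11434/v1/messages"

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"content-type": "application/json"},
    method="POST",
)

# Sending the request requires a running Ollama server:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
print(req.full_url, req.get_method())
```

Pointing an existing Anthropic SDK client at the local base URL should work the same way, since only the host differs from the hosted API.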

🐛 Bug Fixes

  • Fixed an integer underflow during memory estimation on low-VRAM systems when running older models.
  • Improved VRAM measurement accuracy for AMD iGPUs.
  • An error is now returned when embeddings result in `NaN` or `-Inf`.

Affected Symbols