v0.9.1
📦 ollama
✨ 7 features · 🐛 3 fixes · 🔧 6 symbols
Summary
This release adds tool calling for DeepSeek-R1-0528 and Magistral, and ships preview builds of native macOS and Windows applications with support for network exposure and custom model directories.
Migration Steps
- Run 'ollama pull <model>' to re-download models and pick up the improved tool calling support.
- Download the new preview app for macOS or Windows to test the new application architecture.
- If disabling thinking mode on Magistral, update the system prompt as recommended.
✨ New Features
- Tool calling support added for DeepSeek-R1-0528 and Magistral.
- New preview builds of the Ollama macOS and Windows applications.
- Network exposure: Ollama can now be exposed on the network for remote access.
- Local browser access: websites can reach a local Ollama installation from the browser via JavaScript libraries such as ollama-js.
- Custom model directory: Users can now modify the storage path for models (e.g., external disks).
- Native macOS application: Smaller installation footprint and faster startup times.
- Magistral model now supports disabling thinking mode.
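Tool calling goes through Ollama's existing /api/chat endpoint, which accepts an OpenAI-style `tools` array. A minimal sketch of a request body, assuming a local server on the default port; the `get_weather` tool and the `magistral` model tag are illustrative, not part of this release:

```python
import json

# Hypothetical tool definition in the JSON-schema style that
# Ollama's /api/chat endpoint accepts in its "tools" array.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# Request body for: POST http://localhost:11434/api/chat
payload = {
    "model": "magistral",  # example tag; check `ollama list` for yours
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"}
    ],
    "tools": [weather_tool],
    "stream": False,
}

print(json.dumps(payload, indent=2))
```

If the model decides to call the tool, the response's message carries a `tool_calls` entry with the function name and arguments, which your code executes and feeds back as a `tool`-role message.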
🐛 Bug Fixes
- Fixed issue on Windows where 'ollama run' would not start Ollama automatically.
- Improved tool calling reliability for Llama 4 and Mistral.
- Enhanced error messages to be more informative than the generic 'POST predict' error.
🔧 Affected Symbols
- ollama run
- DeepSeek-R1-0528
- Magistral
- Llama 4
- Mistral
- ollama-js
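The network-exposure and custom-model-directory features are driven by environment variables read when the server starts. A configuration sketch for a POSIX shell; the external-disk path is an example, and the app previews expose the same settings in their UI:

```shell
# Listen on all interfaces instead of the default 127.0.0.1,
# making Ollama reachable from other machines on the network.
export OLLAMA_HOST=0.0.0.0

# Store models somewhere other than the default location,
# e.g. an external disk (example path; adjust for your machine).
export OLLAMA_MODELS=/Volumes/External/ollama-models

# Restart the server so the new settings take effect.
ollama serve
```

Exposing the server on the network makes it reachable by anyone on that network, so only enable it on networks you trust.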