
Migrating to llama.cpp b7779

Version b7779 introduces 2 breaking changes. This guide details how to update your code.

Released: 1/19/2026

2 Breaking Changes · 2 Migration Steps · 2 Affected Symbols

⚠️ Check Your Code

If you use any of these symbols, you need to read this guide:

`oai_parser_opt`, `server_chat_params`

Breaking Changes

Issue #1

The `oai_parser_opt` refactoring moved its functionality into `server_chat_params`. Code that relied on the old standalone structure for OpenAI parsing options needs to be updated to reference the new location.
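
The sketch below illustrates the shape of the change with self-contained stand-in types. The member names (`use_jinja`, `reasoning_format`, `oai_parser`, `chat_template`) are hypothetical assumptions, not the actual llama.cpp definitions; only the nesting of the parser options inside the chat-params struct mirrors the refactor.

```cpp
// Minimal, hypothetical sketch of the migration pattern -- NOT the real
// llama.cpp definitions. Only the nesting is the point.
#include <string>

struct oai_parser_opt_stub {            // stand-in for the old oai_parser_opt
    bool        use_jinja = false;      // hypothetical OpenAI-parsing option
    std::string reasoning_format;       // hypothetical OpenAI-parsing option
};

struct server_chat_params_stub {        // stand-in for the new server_chat_params
    oai_parser_opt_stub oai_parser;     // parsing options now travel inside chat params
    std::string         chat_template;  // set explicitly; no implicit ChatML fallback
};

int main() {
    // Before b7779 (conceptually): a standalone options object was configured
    // and passed around on its own.
    // oai_parser_opt_stub legacy_opt;
    // legacy_opt.use_jinja = true;

    // After b7779 (conceptually): the same options are reached through the chat params.
    server_chat_params_stub params;
    params.oai_parser.use_jinja = true;
    params.chat_template        = "chatml";  // state the chat format explicitly
    return 0;
}
```

If you maintain a fork, patch set, or embedder that constructs or reads `oai_parser_opt` directly, route that access through `server_chat_params` instead.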

Issue #2

The chat format fallback to ChatML has been removed. Setups that previously depended on that implicit fallback must now specify the chat format explicitly.
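
For server deployments, the simplest place to do this is on the command line. A minimal sketch, assuming the upstream `--chat-template` flag (which accepts built-in template names such as `chatml`) is available in your b7779 build; the model path is a placeholder:

```sh
# Pass the chat format explicitly instead of relying on the removed ChatML fallback.
llama-server -m ./models/my-model.gguf --chat-template chatml
```

Builds with custom-template support also expose `--chat-template-file` for supplying a Jinja template; check `llama-server --help` on b7779 for the exact set of options available.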

Migration Steps

  1. Update any code referencing `oai_parser_opt` to use the new location within `server_chat_params`.
  2. If relying on an implicit ChatML fallback, explicitly define the required chat format.

Release Summary

This release refactors server configuration by moving OpenAI parsing options to `server_chat_params` and centralizing chat format control in the CLI, while removing the ChatML fallback.

Need More Details?

View the full release notes and all changes for llama.cpp b7779.
