
b8906

📦 llama-cpp
🐛 1 fix · 🔧 1 symbol

Summary

This release fixes a prefix-caching bug that appeared when interacting with the Anthropic API: a changing checksum embedded in a system message is now pinned to a fixed value, so cached prefixes remain valid across requests. It also provides updated pre-compiled binaries for a wide range of platforms.

🐛 Bug Fixes

  • Fixed a prefix-caching issue seen when testing Claude Code against llama.cpp: the changing checksum in the x-anthropic-billing-header system message is now replaced with a fixed value ("fffff"), so otherwise-identical prompts keep the same token prefix and the cached context can be reused.
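Why this matters: a prefix cache only hits when the new prompt matches the cached one token-for-token from the very start, so any per-request value near the top of the system message invalidates everything after it. The sketch below is a minimal Python illustration of the idea, not llama.cpp source; the regex, function name, and sample messages are all illustrative assumptions.

```python
import re

# Illustrative pattern (assumption, not llama.cpp code): match the volatile
# checksum that follows the x-anthropic-billing-header marker.
BILLING_RE = re.compile(r"(x-anthropic-billing-header:\s*)[0-9a-f]+")

def normalize_system_message(msg: str) -> str:
    """Pin the per-request checksum to a fixed value so that
    otherwise-identical prompts share one cached prefix."""
    return BILLING_RE.sub(r"\g<1>fffff", msg)

# Two requests that differ only in the billing checksum...
a = normalize_system_message(
    "x-anthropic-billing-header: 3fa9c1\nYou are a helpful assistant.")
b = normalize_system_message(
    "x-anthropic-billing-header: 77d02e\nYou are a helpful assistant.")

# ...normalize to identical prompts, so the prefix cache can hit.
assert a == b
```

The design point is general: any cache keyed on a prompt prefix needs volatile fields (timestamps, request IDs, checksums) either stripped, pinned, or moved to the end of the prompt.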

Affected Symbols