
python-v0.4.5

Breaking Changes
📦 autogen
1 breaking change · 6 features · 1 bug fix · 9 affected symbols

Summary

This release introduces token streaming for AgentChat, support for R1-style reasoning 'thought' blocks, and the ability to create FunctionTools from partial functions.

⚠️ Breaking Changes

  • The autogen_magentic_one package has been removed. Users should migrate to the integrated agent implementations within the main autogen packages.

Migration Steps

  1. To enable streaming in AssistantAgent, set model_client_stream=True during initialization.
  2. To handle streaming output manually, listen for the autogen_agentchat.messages.ModelClientStreamingChunkEvent type.
  3. For R1 models, configure the model_info with 'family': ModelFamily.R1 to access the .thought field.
  4. Remove any dependencies on the deprecated autogen_magentic_one package.
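Steps 1–3 can be sketched as follows. This is a minimal, illustrative example: the model name and the extra `model_info` keys (`vision`, `function_calling`, `json_output`) are assumptions, and the autogen imports are deferred inside the coroutine so the snippet only requires the packages when actually run.

```python
import asyncio

async def stream_with_r1() -> None:
    # Deferred imports: autogen-agentchat, autogen-core, and autogen-ext are
    # only needed when this coroutine actually runs.
    from autogen_agentchat.agents import AssistantAgent
    from autogen_agentchat.messages import ModelClientStreamingChunkEvent
    from autogen_core.models import ModelFamily
    from autogen_ext.models.openai import OpenAIChatCompletionClient

    # Step 3: declare an R1-family model so reasoning output lands in
    # CreateResult.thought (model name and model_info keys are illustrative).
    client = OpenAIChatCompletionClient(
        model="deepseek-r1",
        model_info={
            "family": ModelFamily.R1,
            "vision": False,
            "function_calling": False,
            "json_output": False,
        },
    )

    # Step 1: enable token streaming on the agent.
    agent = AssistantAgent(
        "assistant", model_client=client, model_client_stream=True
    )

    # Step 2: consume streaming chunk events as they arrive.
    async for message in agent.run_stream(task="Why is the sky blue?"):
        if isinstance(message, ModelClientStreamingChunkEvent):
            print(message.content, end="", flush=True)
```

Run it with `asyncio.run(stream_with_r1())`; without `model_client_stream=True`, the same loop yields only complete messages rather than per-token chunks.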

✨ New Features

  • Introduced streaming support for AgentChat agents and teams via ModelClientStreamingChunkEvent.
  • Added support for R1-style reasoning output, allowing access to 'thought' content in CreateResult.
  • FunctionTool now supports partial functions (functools.partial), allowing pre-set parameters in tool definitions.
  • Added an optional 'sources' parameter to CodeExecutorAgent.
  • Added support for default_header in model clients.
  • Updated OpenAIAssistantAgent to support AsyncAzureOpenAIChatCompletionClient.
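The `functools.partial` support can be sketched as below. The weather function and tool description are hypothetical, and the `FunctionTool` import is deferred so the snippet runs without autogen installed.

```python
from functools import partial

def fetch_weather(city: str, units: str = "metric") -> str:
    """Stub weather lookup used to illustrate partial application."""
    return f"Weather for {city} in {units} units"

# Pre-set the 'units' parameter; only 'city' remains for the model to fill in.
fetch_weather_imperial = partial(fetch_weather, units="imperial")

def make_tool():
    # Deferred import: only needed when the tool is actually constructed.
    from autogen_core.tools import FunctionTool
    return FunctionTool(
        fetch_weather_imperial,
        description="Get the weather for a city (imperial units).",
    )

print(fetch_weather_imperial("Paris"))  # → Weather for Paris in imperial units
```

Before this release, `FunctionTool` required a plain function; wrapping a `partial` lets you fix configuration-style parameters without exposing them in the tool schema.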

🐛 Bug Fixes

  • Fixed handling of non-string function arguments in tool calls and added corresponding warnings.
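The fix concerns models that return tool-call arguments as something other than a JSON string. A hypothetical helper (not autogen's actual code) illustrates the normalize-and-warn behavior:

```python
import json
import warnings

def normalize_tool_arguments(raw):
    """Coerce tool-call arguments to a dict, warning on non-string input."""
    if isinstance(raw, str):
        return json.loads(raw)
    if isinstance(raw, dict):
        warnings.warn("Tool-call arguments were not a JSON string; using dict as-is.")
        return raw
    raise TypeError(f"Unsupported tool-call argument type: {type(raw).__name__}")

print(normalize_tool_arguments('{"city": "Paris"}'))  # → {'city': 'Paris'}
```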

🔧 Affected Symbols

  • autogen_agentchat.agents.AssistantAgent
  • autogen_agentchat.messages.ModelClientStreamingChunkEvent
  • autogen_agentchat.ui.Console
  • autogen_core.models.CreateResult.thought
  • autogen_core.models.ModelFamily.R1
  • autogen_core.tools.FunctionTool
  • autogen_agentchat.agents.CodeExecutorAgent
  • autogen_ext.agents.openai.OpenAIAssistantAgent
  • autogen_magentic_one