Changelog

v3.14.0

📦 datadog-sdk
✨ 4 features · 🐛 10 fixes · 🔧 4 symbols

Summary

This release introduces AI Guard support for LangChain, makes `ml_app` optional for LLM Observability, and includes numerous bug fixes across AAP, Django, asyncpg, exception replay, and CI Visibility.

Migration Steps

  1. For LLM Observability, if you were relying on `ml_app` being mandatory, note that it is now optional and defaults to `service`. Setting it explicitly is still recommended where applicable (see the configuration sketch below).
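
A minimal configuration sketch, assuming the ddtrace-style `LLMObs.enable()` entry point and the `DD_LLMOBS_ML_APP` environment variable; the app name is a placeholder, adapt it to your setup:

```python
from ddtrace.llmobs import LLMObs

# ml_app is now optional; if omitted, it falls back to the service name.
# Setting it explicitly is still recommended where applicable.
LLMObs.enable(ml_app="checkout-assistant")

# Equivalent environment-based configuration:
#   export DD_LLMOBS_ENABLED=1
#   export DD_LLMOBS_ML_APP=checkout-assistant
```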

✨ New Features

  • Added AI Guard evaluations support to LangChain instrumentation.
  • Added evaluation support to streaming LangChain APIs.
  • In LLM Observability, `ml_app` is now optional and defaults to `service`; enabling LLM Observability will no longer throw if one is not provided or propagated from an upstream service.
  • Introduced a `tool_definitions` parameter to the `LLMObs.annotate()` method for tool-calling scenarios, allowing users to pass a list of tool definition dictionaries to annotate LLM spans with the available tools. Each definition must include a `name` field, with optional `description` and `schema` fields (see the sketch below).
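
A minimal sketch of the new parameter based on the field requirements above; the span context manager, model name, and tool contents are illustrative ddtrace-style usage, not prescriptive:

```python
from ddtrace.llmobs import LLMObs

with LLMObs.llm(model_name="gpt-4o", name="plan_trip") as span:
    LLMObs.annotate(
        span=span,
        input_data="Find me a flight to Lisbon",
        tool_definitions=[
            {
                "name": "search_flights",          # required
                "description": "Look up flights",  # optional
                "schema": {                        # optional schema for tool arguments
                    "type": "object",
                    "properties": {"destination": {"type": "string"}},
                },
            }
        ],
    )
```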

🐛 Bug Fixes

  • Ensured the status code for downstream requests is properly sent to libddwaf (AAP).
  • Resolved an incompatibility with `gevent>=25.8.1` that caused a deadlock when starting the WAF via remote configuration (AAP).
  • Fixed an issue causing `ValueError: coroutine already executing` on Python 3.13+ with `django.utils.decorators.async_only_middleware`.
  • Fixed an error in asyncpg related to custom connect options, ensuring `postgres.connect` spans are created when such options are supplied.
  • Fixed an issue in exception replay that prevented snapshots from retrieving local variables from traceback frames of exceptions thrown by Celery tasks.
  • Resolved an issue in LLM Observability where decorated functions returning responses with ambiguous truth values (e.g., pandas DataFrames) would raise an error due to a boolean coercion failure (see the example after this list).
  • Fixed an issue in LLM Observability where certain Google GenAI LLM requests were not being traced due to importing from `google.genai.types` on startup.
  • Fixed an issue that could cause some products to fail to start properly in applications using `pkg_resources`, directly or indirectly.
  • Upgraded echion in profiling to pick up critical bug fixes and performance improvements.
  • Resolved an issue in CI Visibility where coverage from sessions using `pytest-xdist` was not submitted with the proper session ID, impacting Test Impact Analysis.
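
For the boolean-coercion fix above, here is a sketch of the pattern that previously raised; the `@workflow` decorator import follows ddtrace conventions, and the function itself is hypothetical:

```python
import pandas as pd
from ddtrace.llmobs.decorators import workflow  # assumed ddtrace-style import path

@workflow
def rank_answers(question: str) -> pd.DataFrame:
    # Returning an object with an ambiguous truth value (e.g., a DataFrame)
    # previously failed when the return value was coerced to bool while the
    # span output was annotated; it is now recorded without coercion.
    return pd.DataFrame({"question": [question], "score": [0.9]})
```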

🔧 Affected Symbols

  • `LLMObs.annotate`
  • `asyncpg.connect`
  • `django.utils.decorators.async_only_middleware`
  • `google.genai.types`