
v3.11.0rc2

📦 datadog-sdk

✨ 13 features · 🐛 26 fixes · ⚡ 1 deprecation · 🔧 15 symbols

Summary

This release introduces extensive LLM Observability features, including cached-token tracking for Anthropic and Bedrock prompt caching and automatic tracing of Google GenAI calls. It also fixes numerous bugs across CI Visibility, LLM tracing, and core tracing functionality, and deprecates the `freezegun` integration.

Migration Steps

  1. If you rely on the `freezegun` integration for CI Visibility, note that it is deprecated and no longer required; remove any explicit configuration related to it.
  2. If you were using non-structured logging and relied on `ddtrace` to set the format string, ensure your logging setup is compatible now that trace attribute injection is enabled by default (or explicitly set `DD_LOGS_INJECTION=False` if you need to disable injection).
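As a sketch of migration step 2, assuming the standard `ddtrace` entry points named in these notes, the new default can be kept or disabled via the environment:

```shell
# Trace-attribute injection into stdlib logging is now on by default
# when the app is started under ddtrace:
ddtrace-run python app.py          # app.py is a placeholder for your entry point

# Opt out if your log pipeline cannot handle the injected dd.* attributes:
DD_LOGS_INJECTION=false ddtrace-run python app.py
```

This is a config fragment, not a tested command sequence; adjust the entry point and variable casing to match your deployment.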

✨ New Features

  • LLM Observability: `submit_evaluation_for()` now supports boolean evaluation metrics via `metric_type="boolean"`.
  • LLM Observability: Bedrock Converse `toolResult` content blocks are formatted as tool messages on LLM Observability `llm` spans' inputs.
  • LLM Observability: Captures the number of input tokens read from and written to the cache for Anthropic prompt caching.
  • LLM Observability: Captures the number of tokens read from and written to the cache for Bedrock Converse prompt caching.
  • LLM Observability: Adds support to automatically submit Google GenAI calls to LLM Observability.
  • LLM Observability: Introduces tracking cached input token counts for OpenAI chats/responses prompt caching.
  • LLM Observability: Adds support to automatically submit PydanticAI request spans to LLM Observability.
  • MCP SDK: Adds tracing support for `mcp.client.session.ClientSession.call_tool` and `mcp.server.fastmcp.tools.tool_manager.ToolManager.call_tool` methods.
  • OTEL: Adds experimental support for exporting OTLP metrics via the OpenTelemetry Metrics API (requires setting `DD_METRICS_OTEL_ENABLED=true` and configuring an OTLP metrics exporter).
  • ASGI: Obfuscate resource names on 404 spans when `DD_ASGI_OBFUSCATE_404_RESOURCE` is enabled.
  • Code Origin: Added support for in-product enablement.
  • Logging: Automatic injection of trace attributes into logs is now enabled for the standard logging library when using either `ddtrace-run` or `import ddtrace.auto` (can be disabled via `DD_LOGS_INJECTION=False`).
  • Google GenAI: Adds support for APM/LLM Observability tracing for Google GenAI's `embed_content` methods.
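The experimental OTLP metrics feature above needs both the flag and an exporter available at runtime; a minimal sketch, assuming the `opentelemetry-exporter-otlp` package supplies the exporter (the package choice is an assumption, not named in these notes):

```shell
# Enable the experimental OTLP metrics bridge (off by default):
export DD_METRICS_OTEL_ENABLED=true

# An OTLP metrics exporter must be importable; for example (assumed package):
pip install opentelemetry-exporter-otlp

ddtrace-run python app.py          # app.py is a placeholder for your entry point
```

Treat this as a config fragment under stated assumptions, not a definitive setup recipe.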

🐛 Bug Fixes

  • CI Visibility: Resolved an issue where `freezegun` would not work with tests defined in `unittest` classes.
  • CI Visibility: Fixed issue where Test Optimization together with external retry plugins (e.g., `flaky`, `pytest-rerunfailures`) caused incorrect test result reporting; advanced features are disabled when using these plugins.
  • CI Visibility: Resolved an issue where setting custom loggers during a test session caused the tracer to emit logs to a closed stream handler, leading to `I/O operation on closed file` errors.
  • CI Visibility: Fixed issue where test retry numbers were not reported correctly when tests were run with `pytest-xdist`.
  • AAP: Fixed FastAPI body extraction failure in async contexts for large bodies by setting the request body chunk read timeout to 0.1 seconds (configurable via `DD_FASTAPI_ASYNC_BODY_TIMEOUT_SECONDS`).
  • LiteLLM: Fixed issue where potentially sensitive parameters were tagged as metadata on LLM Observability spans; metadata tags are now based on an allowlist instead of a denylist.
  • Lib Injection: Fixed bug preventing the Single Step Instrumentation (SSI) telemetry forwarder from completing when debug logging was enabled.
  • LLM Observability: Addressed upstream issue in Anthropic prompt caching where input tokens were reported incorrectly; now correctly counts input tokens including cached read/write prompt tokens.
  • LLM Observability (OpenAI): Resolved an `AttributeError` raised while parsing `NoneType` streamed chunk deltas during OpenAI tracing.
  • LLM Observability (OpenAI): Fixed an `AttributeError` raised when parsing token metrics for streamed reasoning responses from the Responses API.
  • LLM Observability: Addressed upstream issue in Bedrock prompt caching where input tokens were reported incorrectly; now correctly counts input tokens including cached read/write prompt tokens.
  • LLM Observability: Fixed issue where input messages for tool messages were not being captured properly.
  • LLM Observability: Resolved an index error raised for incomplete streamed responses from the OpenAI Responses API when LLM Observability tracing was enabled.
  • LLM Observability: Fixed broken LangGraph span links for execution flows for `langgraph>=0.3.22`.
  • LLM Observability: Resolved issue where tool choice input messages for OpenAI Chat Completions were not captured.
  • LLM Observability: Fixed `ValueError` raised when grabbing token values for some providers through `langchain` libraries.
  • LLM Observability: Resolved error when passing back tool call results to OpenAI Chat Completions with LLM Observability tracing enabled.
  • Dynamic Instrumentation: Improved support for function probes interacting with the Python garbage collector (e.g., synapse).
  • Logging: Fixed issue where `dd.*` properties were not injected onto logging records unless `DD_LOGS_ENABLED=true` was set, affecting non-structured loggers.
  • Azure Functions: Resolved exception when instrumenting a function that consumes a list of service bus messages.
  • Profiling: Fixed issue with greenlet support that could cause greenlet spawning to fail in rare cases.
  • Profiling: Fixed bug where profile frames from the package specified by `DD_MAIN_PACKAGE` were incorrectly marked as "library" code in the profiler UI.
  • Tracing: Resolved issue where programmatically set span service names were not reported to Remote Configuration.
  • Tracing: Fixed issue where `@tracer.wrap()` decorator failed to preserve the decorated function's return type, returning `Any` instead of the original type.
  • Tracing: Resolved issue where spans had incorrect timestamps and durations when `freezegun` was in use (integration is no longer needed).
  • Tracing: Fixed issue where span durations or start timestamps exceeding the platform's `LONG_MAX` caused traces to fail to send.

🔧 Affected Symbols

`submit_evaluation_for` · `ddtrace.auto` · `ddtrace-run` · `mcp.client.session.ClientSession.call_tool` · `mcp.server.fastmcp.tools.tool_manager.ToolManager.call_tool` · `DD_ASGI_OBFUSCATE_404_RESOURCE` · `DD_LOGS_INJECTION` · `embed_content` · `freezegun` · `flaky` · `pytest-rerunfailures` · `pytest-xdist` · `DD_FASTAPI_ASYNC_BODY_TIMEOUT_SECONDS` · `DD_MAIN_PACKAGE` · `@tracer.wrap()`

⚡ Deprecations

  • CI Visibility: The `freezegun` integration is deprecated and will be removed in 4.0.0; it is no longer necessary for correct reporting of test durations and timestamps.