v3.11.0rc3
📦 datadog-sdk
✨ 16 features · 🐛 23 fixes · ⚡ 1 deprecation · 🔧 27 symbols
Summary
This release introduces significant new features across LLM Observability, AAP endpoint discovery, and experimental OTLP metrics export. It also includes numerous bug fixes, particularly addressing tracing and token counting issues in LLM integrations, and deprecates the `freezegun` integration for CI Visibility.
Migration Steps
- If you rely on the deprecated `freezegun` integration for CI Visibility, note that it will be removed in 4.0.0. No action is immediately required unless you are planning to upgrade to 4.0.0 soon.
- If you encounter issues with FastAPI async body extraction for large bodies, consider setting the `DD_FASTAPI_ASYNC_BODY_TIMEOUT_SECONDS` environment variable.
- To enable experimental OTLP metrics export, set the environment variable `DD_METRICS_OTEL_ENABLED` to `true` and ensure your application includes an OTLP metrics exporter.
- To enable obfuscation of resource names on 404 spans for ASGI, set the environment variable `DD_ASGI_OBFUSCATE_404_RESOURCE` to `true`.
- To disable automatic trace attribute injection into logs, set the environment variable `DD_LOGS_INJECTION` to `False`.
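The steps above map onto environment settings like the following (the variable names come from this release; the timeout value shown is illustrative, so set only what applies to your deployment):

```shell
# Raise the async body-read timeout if large FastAPI bodies are truncated
export DD_FASTAPI_ASYNC_BODY_TIMEOUT_SECONDS=0.5
# Opt in to experimental OTLP metrics export
export DD_METRICS_OTEL_ENABLED=true
# Obfuscate resource names on ASGI 404 spans
export DD_ASGI_OBFUSCATE_404_RESOURCE=true
# Opt out of automatic trace-attribute injection into logs
export DD_LOGS_INJECTION=False
```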
✨ New Features
- AAP: Introduces endpoint discovery for Django applications, allowing collection of API endpoints at startup.
- aws: Sets `peer.service` explicitly and removes the default `base_service` value of "runtime" from Lambda spans, improving the accuracy of serverless service representation.
- LLM Observability: Added support to `submit_evaluation_for()` for submitting boolean metrics using `metric_type="boolean"` for binary evaluation results.
- LLM Observability: Introduces tagging agent-specific metadata on agent spans when using CrewAI, OpenAI Agents, or PydanticAI.
- LLM Observability: Bedrock Converse `toolResult` content blocks are now formatted as tool messages on LLM Observability `llm` spans' inputs.
- LLM Observability: Captures the number of input tokens read from and written to the cache for Anthropic prompt caching use cases.
- LLM Observability: Tracks the number of tokens read from and written to the cache for Bedrock Converse prompt caching.
- LLM Observability: Adds support to automatically submit Google GenAI calls to LLM Observability.
- LLM Observability: Introduces tracking cached input token counts for OpenAI chats/responses prompt caching.
- LLM Observability: Adds support to automatically submit PydanticAI request spans to LLM Observability.
- mcp: Adds tracing support for `mcp.client.session.ClientSession.call_tool` and `mcp.server.fastmcp.tools.tool_manager.ToolManager.call_tool` methods in the MCP SDK.
- otel: Adds experimental support for exporting OTLP metrics via the OpenTelemetry Metrics API (requires setting `DD_METRICS_OTEL_ENABLED=true`).
- asgi: Obfuscate resource names on 404 spans when `DD_ASGI_OBFUSCATE_404_RESOURCE` is enabled.
- code origin: Adds support for in-product enablement.
- logging: Automatic injection of trace attributes into logs is now enabled for the standard logging library when using `ddtrace-run` or `import ddtrace.auto` (can be disabled via `DD_LOGS_INJECTION=False`).
- google_genai: Adds support for APM/LLM Observability tracing for Google GenAI's `embed_content` methods.
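To illustrate the 404 obfuscation feature above, here is a minimal sketch of the idea (not the ddtrace implementation; the function name and the `"404"` placeholder are hypothetical), gated on the environment variable named in this release:

```python
import os

def obfuscate_404_resource(resource: str, status_code: int) -> str:
    """Collapse resource names on 404 responses into a fixed placeholder so
    random scanner paths do not inflate resource-name cardinality."""
    # Hypothetical sketch: gate on the env var named in the release notes.
    enabled = os.environ.get("DD_ASGI_OBFUSCATE_404_RESOURCE", "").lower() == "true"
    if status_code == 404 and enabled:
        return "404"
    return resource

# Usage: scanner-generated paths collapse into a single resource name.
os.environ["DD_ASGI_OBFUSCATE_404_RESOURCE"] = "true"
print(obfuscate_404_resource("GET /wp-admin/setup.php", 404))  # prints: 404
print(obfuscate_404_resource("GET /users", 200))               # prints: GET /users
```

The benefit is on the backend side: without obfuscation, every probe path from a vulnerability scanner becomes its own resource in the trace UI.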
🐛 Bug Fixes
- CI Visibility: Resolved an issue where `freezegun` would not work with tests defined in `unittest` classes.
- CI Visibility: Fixed issue where using Test Optimization with external retry plugins (like `flaky` or `pytest-rerunfailures`) caused incorrect test result reporting (Test Optimization advanced features are disabled when using these plugins).
- CI Visibility: Resolved an issue causing `I/O operation on closed file` errors at test session end when setting custom loggers due to `pytest` closing handlers.
- CI Visibility: Fixed incorrect reporting of test retry numbers when tests were run with `pytest-xdist`.
- AAP: Fixed FastAPI body extraction failure in async contexts for large bodies by setting the request body chunk read timeout to 0.1s (configurable via `DD_FASTAPI_ASYNC_BODY_TIMEOUT_SECONDS`).
- litellm: Resolved issue where potentially sensitive parameters were tagged as metadata on LLM Observability spans; metadata tags are now based on an allowlist instead of a denylist.
- lib-injection: Fixed a bug preventing the Single Step Instrumentation (SSI) telemetry forwarder from completing when debug logging was enabled.
- LLM Observability: Addresses upstream issue in Anthropic prompt caching where input tokens were reported incorrectly; now correctly counts input tokens including cached read/write prompt tokens.
- LLM Observability (openai): Resolved an `AttributeError` raised while parsing `NoneType` streamed chunk deltas during OpenAI tracing.
- LLM Observability (openai): Fixed an `AttributeError` raised when parsing token metrics for streamed reasoning responses from the Responses API.
- LLM Observability: Addresses upstream issue in Bedrock prompt caching where input tokens were reported incorrectly; now correctly counts input tokens including cached read/write prompt tokens.
- LLM Observability: Fixed issue where input messages for tool messages were not being captured properly.
- LLM Observability: Resolved `IndexError` when incomplete streamed responses were returned from OpenAI responses API with LLM Observability tracing enabled.
- LLM Observability: Fixed broken LangGraph span links for execution flows for `langgraph>=0.3.22`.
- LLM Observability: Resolved issue where tool call input messages for OpenAI Chat Completions were not captured.
- LLM Observability: Fixed `ValueError` raised when grabbing token values for some providers through `langchain` libraries.
- LLM Observability: Resolved error when passing back tool call results to OpenAI Chat Completions with LLM Observability tracing enabled.
- dynamic instrumentation: Improved support for function probes with frameworks interacting with the Python garbage collector (e.g., synapse).
- logging: Fixed issue where `dd.*` properties were not injected onto logging records unless `DD_LOGS_ENABLED=true` was set, affecting non-structured loggers.
- azure_functions: Resolved an issue causing an exception when instrumenting a function that consumes a list of service bus messages.
- profiling: Fixed an issue with greenlet support that could cause greenlet spawning to fail in rare cases.
- profiling: Fixed a bug where profile frames from the package specified by `DD_MAIN_PACKAGE` were incorrectly marked as "library" code in the profiler UI.
- tracing: Resolved issue where programmatically set span service names were not reported to Remote Configuration.
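The FastAPI async body fix above relies on bounding each chunk read with a timeout so a slow or stalled body stream cannot block the application. A minimal sketch of that pattern, assuming an ASGI-style `receive()` callable (the function and helper names here are hypothetical, not ddtrace's internals):

```python
import asyncio

async def read_body_with_timeout(receive, timeout: float = 0.1) -> bytes:
    """Accumulate ASGI body chunks, giving up on the remainder if a single
    receive() call exceeds the timeout (0.1s mirrors the release note default)."""
    body = b""
    while True:
        try:
            message = await asyncio.wait_for(receive(), timeout=timeout)
        except asyncio.TimeoutError:
            # Stop collecting rather than blocking the request handler.
            break
        body += message.get("body", b"")
        if not message.get("more_body", False):
            break
    return body

def make_receive(chunks):
    """Test helper: yield the given ASGI messages one per receive() call."""
    it = iter(chunks)
    async def receive():
        return next(it)
    return receive

# Usage: two fast chunks are fully assembled.
chunks = [{"body": b"hello", "more_body": True},
          {"body": b" world", "more_body": False}]
print(asyncio.run(read_body_with_timeout(make_receive(chunks))))  # prints: b'hello world'
```

The trade-off is deliberate: a truncated body for very slow streams is preferable to a blocked event loop, which is why the timeout is exposed as a tunable (`DD_FASTAPI_ASYNC_BODY_TIMEOUT_SECONDS`).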
🔧 Affected Symbols
`freezegun`, `pytest`, `coverage.py`, `flaky`, `pytest-rerunfailures`, `pytest-xdist`, FastAPI, Anthropic, CrewAI, OpenAI Agents, PydanticAI, Bedrock Converse, `mcp.client.session.ClientSession.call_tool`, `mcp.server.fastmcp.tools.tool_manager.ToolManager.call_tool`, OpenTelemetry Metrics API, asgi, `ddtrace.auto`, `google_genai.embed_content`, litellm, lib-injection, LangGraph, openai, langchain, synapse, azure_functions, greenlet, `DD_MAIN_PACKAGE`
⚡ Deprecations
- CI Visibility: The `freezegun` integration is deprecated and will be removed in 4.0.0. It is no longer necessary for correct reporting of test durations and timestamps.