v3.16.0rc2
📦 datadog-sdk
✨ 3 features · 🐛 8 fixes · ⚡ 1 deprecation · 🔧 5 symbols
Summary
This release introduces Python 3.14 compatibility for core library injection and adds features such as a Ray tracing integration and default OpenTelemetry metrics configuration. Several key integrations remain incompatible with Python 3.14 pending upstream dependency updates.
Migration Steps
- If using Ray, enable tracing by starting the Ray head with --tracing-startup-hook=ddtrace.contrib.ray:setup_tracing.
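A minimal sketch of what this looks like from application code, assuming the head node was already started with the flag above; the task and values below are illustrative:

```python
# Connect to a Ray head started with:
#   ray start --head --tracing-startup-hook=ddtrace.contrib.ray:setup_tracing
import ray

ray.init(address="auto")  # attach to the already-running, trace-enabled cluster

@ray.remote
def add(x, y):
    # Remote task invocations are traced by the Ray integration (Ray >= 2.46.0).
    return x + y

print(ray.get(add.remote(1, 2)))  # this remote call produces a task span
```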
✨ New Features
- Adds default configurations for the OpenTelemetry Metrics API implementation to improve the Datadog user experience, including setting OTEL_EXPORTER_OTLP_METRICS_ENDPOINT, OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE, OTEL_METRIC_EXPORT_INTERVAL, and OTEL_METRIC_EXPORT_TIMEOUT (see the sketch after this list).
- LLM Observability now traces ClientSession contexts, ClientSession.initialize, and ClientSession.list_tools in the MCP integration (see the example after this list).
- Introduces a Ray core integration that traces Ray jobs, remote tasks, and actor method calls (supported for Ray >= 2.46.0). To enable tracing, start the Ray head with --tracing-startup-hook=ddtrace.contrib.ray:setup_tracing.
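For the OpenTelemetry metrics defaults, a minimal sketch of standard Metrics API usage: the exporter wiring (endpoint, temporality preference, export interval and timeout) is taken from the environment variables listed above, which ddtrace now defaults and which you can still override. Instrument names and attributes here are illustrative.

```python
# Standard OpenTelemetry Metrics API usage; exporter configuration comes from the
# OTEL_EXPORTER_OTLP_METRICS_* / OTEL_METRIC_EXPORT_* environment variables,
# for which ddtrace now supplies defaults. Names and attributes are illustrative.
from opentelemetry import metrics

meter = metrics.get_meter("checkout-service")
orders = meter.create_counter("orders.completed", unit="1", description="Completed orders")
orders.add(1, {"payment.method": "card"})
```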
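For the MCP tracing addition, a hedged sketch of a client whose ClientSession lifecycle, initialize, and list_tools calls would be captured once LLM Observability is enabled; the server command and the ml_app name are assumptions for illustration:

```python
# Hedged sketch: an MCP client whose session context, initialize(), and
# list_tools() calls are traced by LLM Observability.
# "my_mcp_server.py" and the ml_app name are illustrative assumptions.
import asyncio

from ddtrace.llmobs import LLMObs
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

LLMObs.enable(ml_app="mcp-demo")  # turn on LLM Observability in code

async def main():
    params = StdioServerParameters(command="python", args=["my_mcp_server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:  # session context is traced
            await session.initialize()                      # traced
            tools = await session.list_tools()              # traced
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```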
🐛 Bug Fixes
- Resolved an issue where stream endpoints with daphne/django were unresponsive due to an asyncio error (AAP).
- Fixed an issue where code imported at module level but not executed during a test would not be considered by Test Impact Analysis as impacting the test; code executed at import time is now included among impacted lines (CI Visibility).
- Fixed an AttributeError that could occur when tracing Google ADK agent runs due to the agent's model attribute not being defined for the SequentialAgent class.
- Fixed the parsing of OTLP metrics exporter configurations, including automatically appending the v1/metrics path to HTTP OTLP endpoints (opentelemetry).
- Resolved an issue where langchain patching would throw an ImportError when using langchain_core>=0.3.76.
- Ensures APM is disabled when DD_APM_TRACING_ENABLED=0 is set while using LLM Observability.
- Resolved an issue in the bedrock integration where model IDs were not parsed correctly when the model ID was an inference profile ID (LLM Observability).
- Enabled the backend to differentiate AI Observability spans from other DJM spans, so that AI Observability spans are not billed as part of the APM bill (LLM Observability).
🔧 Affected Symbols
- ClientSession
- ClientSession.initialize
- ClientSession.list_tools
- SequentialAgent
- langchain_core
⚡ Deprecations
- The vertica integration is deprecated and will be removed in a future version, around the same time that ddtrace drops support for Python 3.9.