v3.16.0
📦 datadog-sdk
✨ 3 features · 🐛 8 fixes · ⚡ 1 deprecation · 🔧 5 symbols
Summary
This release introduces Python 3.14 compatibility updates and new features for OpenTelemetry metrics and Ray tracing, alongside several bug fixes, particularly for LLM Observability and CI Visibility.
Migration Steps
- If using Ray, enable tracing by starting the Ray head with --tracing-startup-hook=ddtrace.contrib.ray:setup_tracing (a minimal sketch follows below).
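A minimal sketch of a Ray workload that would be traced under this setup, assuming the head node was already started with the startup hook above, Ray >= 2.46.0, and ddtrace installed; the task and actor names are illustrative only, not part of the integration.

```python
# Sketch only: assumes a Ray cluster already started with
#   ray start --head --tracing-startup-hook=ddtrace.contrib.ray:setup_tracing
import ray

ray.init(address="auto")  # connect to the already-running, instrumented cluster

@ray.remote
def add(x: int, y: int) -> int:
    # Remote task invocations are traced by the Ray integration.
    return x + y

@ray.remote
class Counter:
    def __init__(self) -> None:
        self.value = 0

    def increment(self) -> int:
        # Actor method calls are traced as well.
        self.value += 1
        return self.value

if __name__ == "__main__":
    print(ray.get(add.remote(1, 2)))            # 3
    counter = Counter.remote()
    print(ray.get(counter.increment.remote()))  # 1
```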
✨ New Features
- opentelemetry: Adds default configurations for the OpenTelemetry Metrics API implementation to improve the Datadog user experience, setting OTEL_EXPORTER_OTLP_METRICS_ENDPOINT, OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE, OTEL_METRIC_EXPORT_INTERVAL, and OTEL_METRIC_EXPORT_TIMEOUT (see the configuration sketch after this list).
- LLM Observability: The MCP integration now also traces ClientSession contexts, ClientSession.initialize, and ClientSession.list_tools.
- ray: Introduces a Ray core integration that traces Ray jobs, remote tasks, and actor method calls (supported for Ray >= 2.46.0). Enable tracing via --tracing-startup-hook=ddtrace.contrib.ray:setup_tracing.
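A hedged sketch of overriding the new OTLP metrics settings via environment variables before recording metrics through the OpenTelemetry Metrics API. The endpoint, temporality, and interval values below are illustrative overrides, not the defaults shipped in this release, and the meter/counter names are examples; how ddtrace wires in its Metrics API implementation is not shown here.

```python
# Sketch only: values are illustrative overrides, not the defaults set by ddtrace.
import os

os.environ.setdefault("OTEL_EXPORTER_OTLP_METRICS_ENDPOINT", "http://localhost:4318/v1/metrics")
os.environ.setdefault("OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE", "delta")
os.environ.setdefault("OTEL_METRIC_EXPORT_INTERVAL", "10000")  # milliseconds
os.environ.setdefault("OTEL_METRIC_EXPORT_TIMEOUT", "5000")    # milliseconds

from opentelemetry import metrics

# Standard OpenTelemetry Metrics API usage; the configured exporter settings
# above govern where and how often measurements are exported.
meter = metrics.get_meter("example.meter")
request_counter = meter.create_counter("example.requests")
request_counter.add(1, {"endpoint": "/health"})
```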
🐛 Bug Fixes
- AAP: Resolves an issue where stream endpoints with Daphne/Django were unresponsive due to an asyncio error.
- CI Visibility: Fixes an issue where code imported at module level but not executed during a test would not be considered by Test Impact Analysis as impacting the test; code executed at import time is now included.
- google-adk: Fixes an AttributeError that could occur when tracing Google ADK agent runs, caused by the agent model attribute not being defined on the SequentialAgent class.
- opentelemetry: Fixes the parsing of OTLP metrics exporter configurations and the logic that automatically appends the v1/metrics path to HTTP OTLP endpoints.
- langchain: Resolves an issue where langchain patching would throw an ImportError when using langchain_core>=0.3.76.
- LLM Observability: Ensures APM tracing is disabled when DD_APM_TRACING_ENABLED=0 is set while using LLM Observability (see the sketch after this list).
- LLM Observability: Resolves an issue in the bedrock integration where model IDs were not parsed correctly when the model ID was an inference profile ID.
- LLM Observability: Enables the backend to differentiate AI Observability spans from other DJM spans, preventing AI Observability spans from being billed as part of APM.
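A minimal sketch of running LLM Observability with APM tracing disabled, matching the fix above. It assumes the ddtrace.llmobs.LLMObs entry point; the DD_APM_TRACING_ENABLED variable comes from the release note, and "my-llm-app" is an illustrative application name.

```python
# Sketch only: "my-llm-app" is an illustrative ml_app name.
import os

os.environ["DD_APM_TRACING_ENABLED"] = "0"  # with this fix, APM tracing stays off

from ddtrace.llmobs import LLMObs

LLMObs.enable(ml_app="my-llm-app")  # LLM Observability spans are still submitted

# ... application code instrumented by LLM Observability integrations runs here ...

LLMObs.flush()  # flush any pending LLM Observability spans before exit
```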
🔧 Affected Symbols
ClientSession, ClientSession.initialize, ClientSession.list_tools, SequentialAgent, langchain_core
⚡ Deprecations
- vertica: The vertica integration is deprecated and will be removed in a future version, around the same time that ddtrace drops support for Python 3.9.