v3.12.0
📦 datadog-sdk
✨ 10 features · 🐛 17 fixes · ⚡ 4 deprecations · 🔧 21 symbols
Summary
This release introduces the technical preview of the AI Guard Python SDK, adds experimental CI Visibility writer separation, and enhances LLM Observability tracing across several integrations. Several deprecations are noted, including the removal of Windows profiling support and the deprecation of `ddtrace.settings.__init__` imports.
Migration Steps
- If you were using the deprecated `non_active_span` parameter in `HTTPPropagator.inject`, update calls to use the `context` parameter instead: `HTTPPropagator.inject(context=...)` (see the sketch after this list).
- If you were relying on `DD_PROFILING_MAX_EVENTS`, switch to using `DD_PROFILING_HEAP_SAMPLE_SIZE` for controlling memory profiler sampling frequency.
- If you are using ASGI and want tracing on websocket messages, ensure `DD_TRACE_WEBSOCKET_MESSAGES_ENABLED` is set (it replaces `DD_TRACE_WEBSOCKET_MESSAGES`).
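A minimal sketch of the `HTTPPropagator.inject` migration. The import paths, the `headers` keyword, and the use of `span.context` are assumptions carried over from earlier releases; check the 3.12 API reference for the exact signature:

```python
from ddtrace.trace import tracer
from ddtrace.propagation.http import HTTPPropagator

headers = {}
with tracer.trace("outbound.request") as span:
    # Before (deprecated): passing the span via the `non_active_span` keyword.
    # After: pass the trace context through the `context` keyword instead.
    HTTPPropagator.inject(context=span.context, headers=headers)

# `headers` now carries the distributed tracing headers for the outbound call.
```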
β¨ New Features
- App and API Protection (AAP): Introduces a public Python SDK that provides programmatic access to AI Guard's public endpoint (technical preview).
- asgi: Adds tracing of websocket messages as spans, enabled with `DD_TRACE_WEBSOCKET_MESSAGES_ENABLED`, which replaces `DD_TRACE_WEBSOCKET_MESSAGES`.
- CI Visibility: Introduces an alternative method for collecting and sending test spans, keeping the `CIVisibility` tracer separate from the global `ddtrace` tracer (experimental, enable with `DD_CIVISIBILITY_USE_BETA_WRITER=true`).
- crewai: Introduces APM and LLM Observability tracing support for CrewAI Flow `kickoff/kickoff_async` calls, including tracing internal flow method execution.
- LLM Observability: Adds support for collecting tool definitions, tool calls and tool results in the Anthropic integration.
- LLM Observability: Increases span event size limit from 1MB to 5MB.
- LLM Observability: Records agent manifest information for LangGraph compiled graphs.
- LLM Observability: Adds the ability to drop spans by having a `SpanProcessor` return `None` (see the sketch after this list).
- LLM Observability (mcp): Adds distributed tracing support for MCP tool calls across client-server boundaries by default.
- Remote Config: Eagerly queries Remote Config at process startup to ensure timely configuration updates.
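A minimal sketch of dropping LLM Observability spans from a processor. The `LLMObs.register_processor` hook and the `get_tag` accessor are assumptions based on the documented span-processing API, and the tag used is hypothetical; verify the names against your ddtrace version:

```python
from ddtrace.llmobs import LLMObs

def drop_noisy_spans(span):
    # Returning None from a processor drops the span entirely (new in 3.12.0);
    # returning the span (possibly modified) submits it as usual.
    if span.get_tag("session_id") == "load-test":  # hypothetical tag, for illustration only
        return None
    return span

LLMObs.enable()
LLMObs.register_processor(drop_noisy_spans)
```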
🐛 Bug Fixes
- AAP: Resolves a bug where ASGI middleware would not catch the BlockingException raised by AAP because it was aggregated in an ExceptionGroup.
- AAP: Resolves an issue where a malformed package would prevent reporting of other correctly formed packages to Software Composition Analysis.
- AAP: Resolves an issue where the `route` parameter was not being correctly handled in the Django path function.
- CI Visibility: Resolves an issue where using the pytest `skipif` marker with the condition passed as a keyword argument (or not provided at all) would cause the test to be reported as failed, especially when `flaky` or `pytest-rerunfailures` were used.
- ddtrace_api: Fixes a bug in the ddtrace_api integration where calling `patch()` with no arguments (and thus `patch_all()`) broke the integration.
- django: Fixes an incorrect component tag being set for Django ORM spans.
- dynamic instrumentation: Extends captured-value redaction to mappings with keys of type `bytes`.
- openai: Resolves an issue where an uninitialized `OpenAI/AsyncOpenAI` client would result in an `AttributeError`.
- pydantic_ai: Resolves an issue where enabling the Pydantic AI integration for `pydantic-ai-slim >= 0.4.4` would fail.
- tracing: Resolves an issue where sampling rules with null values for service, resource, or name would not match any spans; null and unset fields are now treated the same (see the sketch after this list).
- tracing: Fix inconsistent trace sampling during partial flush (traces >300 spans) by correctly applying sampling rules to the root span.
- kafka: Resolves an issue where the `list_topics` call in the Kafka integration could hang indefinitely; the call now uses a 1-second timeout and caches results/failures.
- Code Security (IAST): Fixes Gevent worker timeouts by preloading IAST early and refactoring taint sink initialization.
- LLM Observability: Fixes a bug where code-execution outputs returned through `google-genai` would result in no output messages on the LLM Observability `llm` span.
- LLM Observability (langgraph): Resolves `ModuleNotFoundError` errors when patching `langgraph>=0.6.0`.
- LLM Observability (openai): Fixes an issue where using the OpenAI Responses API with `openai>=1.66.0,<1.66.2` would result in an `AttributeError`.
- Flares: Fixes tracer flares to match the specification.
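A minimal sketch of the sampling-rule behavior described above. The rule values are hypothetical, and `DD_TRACE_SAMPLING_RULES` must be set before ddtrace is imported or the tracer is started:

```python
import json
import os

# Hypothetical rule: keep 10% of traces for the "checkout" service.
# With the 3.12.0 fix, leaving "name" and "resource" unset (or null) means the
# rule matches spans with any name/resource instead of matching nothing.
os.environ["DD_TRACE_SAMPLING_RULES"] = json.dumps(
    [{"service": "checkout", "sample_rate": 0.1}]
)
```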
🔧 Affected Symbols
`ddtrace.settings.__init__`, `HTTPPropagator.inject`, `CIVisibility`, `ddtrace`, `kickoff/kickoff_async`, Anthropic integration, LangGraph, `SpanProcessor`, MCP tool calls, ASGI middleware, Django path function, `ddtrace_api.patch()`, `ddtrace_api.patch_all()`, Django ORM spans, Kafka `list_topics`, IAST, Gevent worker, `google-genai`, `langgraph`, OpenAI Responses API, `OpenAI`/`AsyncOpenAI` client
⚡ Deprecations
- tracing: `ddtrace.settings.__init__` imports are deprecated and will be removed in version 4.0.0.
- tracing: Deprecates the `non_active_span` parameter in the `HTTPPropagator.inject` method. Use `HTTPPropagator.inject(context=...)` instead.
- profiling: Windows support is removed.
- profiling: The `DD_PROFILING_MAX_EVENTS` environment variable is deprecated and has no effect. Use `DD_PROFILING_HEAP_SAMPLE_SIZE` to control the sampling frequency of the memory profiler.