v4.1.0rc3
📦 datadog-sdk
⚠ 2 breaking · ✨ 28 features · 🐛 18 fixes · ⚡ 3 deprecations · 🔧 17 symbols
Summary
This release removes the aioredis integration and drops support for 32-bit Linux. It also introduces extensive new features for Python profiling (including Python 3.14 support) and significant enhancements to LLM Observability, notably multi-run experiment tracking and richer span tagging.
⚠️ Breaking Changes
- 32-bit Linux is no longer supported. Users must migrate to a 64-bit environment.
- The aioredis integration has been removed. Users relying on this integration must find an alternative or remove the dependency.
Migration Steps
- If you are using 32-bit Linux, migrate to a 64-bit environment.
- If you were using the `aioredis` integration, remove it or switch to an alternative; the integration is no longer shipped.
- If you were configuring Tornado tracing programmatically via `ddtrace.contrib.tornado`, update your configuration to use environment variables or `import ddtrace.auto` (a sketch follows this list).
- If you are using Tornado, ensure you are running version v6.1 or later.
- When using LLM Observability with multi-run experiments, use the `ExperimentResult.runs` attribute instead of `rows` or `summary_evaluations` to access results.
- When running Ray jobs, ensure you use `ddtrace-run` to start your Ray cluster (e.g., `DD_PATCH_MODULES="ray:true,aiohttp:false,grpc:false,requests:false" ddtrace-run ray start --head`) if `ray.init()` is not explicitly called at the top of the script.
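For the Tornado change, a minimal sketch of the recommended setup, assuming a standard Tornado 6.1+ application; the environment variables shown (`DD_SERVICE`, `DD_ENV`) are the usual ddtrace settings, not an exhaustive list:

```python
# Instead of programmatic configuration via ddtrace.contrib.tornado (deprecated),
# configure the tracer through environment variables, e.g.:
#   DD_SERVICE=my-tornado-app DD_ENV=prod python app.py
# and enable instrumentation at the top of the entrypoint:
import ddtrace.auto  # noqa: F401  # patches supported libraries, including Tornado >= 6.1

import tornado.ioloop
import tornado.web


class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.write("hello")


if __name__ == "__main__":
    tornado.web.Application([(r"/", MainHandler)]).listen(8888)
    tornado.ioloop.IOLoop.current().start()
```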
✨ New Features
- LLM Observability: Experiment spans now include metadata from the dataset record.
- LLM Observability: The `input`, `output`, and `expected_output` fields on experiment spans are now emitted as-is, so object data is searchable in Datadog.
- LLM Observability: Experiment spans and their child spans are now tagged with human-readable names (`dataset_name`, `project_name`, `project_id`, `experiment_name`) for easier analysis.
- Profiling: Added lock profiling support for `threading.BoundedSemaphore`.
- Profiling: Added lock profiling support for `threading.Semaphore`, correctly marking internal locks (a sketch follows this list).
- Profiling: Added support for Python 3.14 in the Continuous Profiler.
- Profiling: Added the `process_id` tag to profiles, containing the current process ID (PID).
- Profiling: The stack sampler supports async generators and `asyncio.wait`.
- Profiling: Memory-profiler and lock-profiler flame graphs now show fully qualified function names via `codeobject.co_qualname` on Python 3.11+.
- Profiling: Introduces tracking for the `asyncio.as_completed` utility in the Profiler.
- Profiling: Introduces tracking for `asyncio.wait` in the Profiler (a sketch follows this list).
- AAP: Attaches Application and API Protection findings to API Gateway inferred spans, enabling AppSec API Catalog coverage of Lambda functions.
- AAP: Introduces proper support for API10 for redirected requests on urllib3.
- Anthropic: Adds support for the Anthropic Beta client API (`client.beta.messages.create()` and `client.beta.messages.stream()`); requires Anthropic client version 0.37.0 or higher (a sketch follows this list).
- Aiokafka: Adds DSM instrumentation support.
- Aiokafka: Adds instrumentation support for `aiokafka>=0.9.0`.
- Added support for uWSGI with gevent when threads are also patched; the use of `thread=False` is no longer required with `gevent.monkey.patch_all`.
- LLM Observability: Reasoning token counts are now captured from Google GenAI responses.
- LLM Observability: The OpenAI integration now captures prompt metadata (id, version, variables, and chat template) for reusable prompts when using the `responses` endpoint with OpenAI SDK >= 1.87.0 (a sketch follows this list).
- LLM Observability: Experiments can now be run multiple times using the optional `runs` argument, accessible via the new `ExperimentResult.runs` attribute.
- LLM Observability: Non-root experiment spans are now tagged with experiment ID, run ID, and run iteration tags.
- LLM Observability: Adds additional tags to MCP client session and tool call spans for LLM Observability MCP tool call features.
- LLM Observability: Reasoning token counts are now captured from OpenAI and OpenAI Agents responses.
- LLM Observability/OpenAI: Introduces support for capturing server-side MCP tool calls invoked via the OpenAI Responses API as a separate span.
- Langchain: Adds support for tracing `RunnableLambda` instances.
- MCP: Marks client MCP tool call spans as errors when the corresponding server-side tool call returned an error.
- Crashtracker: Introduces a fallback to capture runtime stack frames when Python's `_Py_DumpTracebackThreads` function is not available.
- ASGI: Enable context propagation between websocket message spans.
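For the new `threading.Semaphore` and `threading.BoundedSemaphore` lock profiling, a small self-contained sketch; with profiling enabled (for example `DD_PROFILING_ENABLED=true` with `ddtrace-run`), acquire/release activity on these primitives is captured automatically and no code changes are required:

```python
import threading

# Run under the profiler, e.g.:
#   DD_PROFILING_ENABLED=true ddtrace-run python semaphores.py
pool_slots = threading.BoundedSemaphore(value=4)  # newly supported lock type
work_ready = threading.Semaphore(value=0)         # newly supported lock type


def worker() -> None:
    with pool_slots:          # acquire/release recorded by the lock profiler
        work_ready.acquire()  # contention here shows up in lock profiles


threads = [threading.Thread(target=worker) for _ in range(8)]
for t in threads:
    t.start()
for _ in threads:
    work_ready.release()
for t in threads:
    t.join()
```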
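For the new `asyncio.wait` and `asyncio.as_completed` tracking, a minimal async sketch; with the profiler enabled, stacks sampled while awaiting through these helpers are now tracked (the coroutine below is purely illustrative):

```python
import asyncio


async def fetch(i: int) -> int:
    await asyncio.sleep(0.1)  # stand-in for real I/O
    return i


async def main() -> None:
    # asyncio.wait is now tracked by the stack sampler.
    tasks = [asyncio.create_task(fetch(i)) for i in range(5)]
    done, pending = await asyncio.wait(tasks, timeout=1.0)

    # asyncio.as_completed is now tracked as well.
    for coro in asyncio.as_completed([fetch(i) for i in range(5)]):
        print(await coro)


asyncio.run(main())
```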
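For the Anthropic Beta client support, a hedged sketch of the calls that are now traced; the model name and keyword arguments follow common Anthropic SDK usage and may differ in your client version (>= 0.37.0 is required):

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Calls through the Beta namespace are now captured by the integration,
# in addition to the stable client.messages API.
response = client.beta.messages.create(
    model="claude-3-5-sonnet-latest",  # illustrative model name
    max_tokens=256,
    messages=[{"role": "user", "content": "Summarize this release in one line."}],
)
print(response.content)

# Streaming through the Beta namespace is traced as well.
with client.beta.messages.stream(
    model="claude-3-5-sonnet-latest",
    max_tokens=256,
    messages=[{"role": "user", "content": "Write a short haiku."}],
) as stream:
    for text in stream.text_stream:
        print(text, end="")
```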
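For the reusable-prompt metadata capture on the OpenAI `responses` endpoint, a hedged sketch; the prompt ID, version, and variables below are placeholders, and the exact `prompt` payload accepted by `client.responses.create()` depends on your OpenAI SDK version (>= 1.87.0):

```python
import openai

client = openai.OpenAI()  # reads OPENAI_API_KEY from the environment

# When a reusable prompt is referenced, the integration now records its
# id, version, and variables (and the chat template) on the resulting span.
response = client.responses.create(
    prompt={
        "id": "pmpt_example_123",              # placeholder prompt ID
        "version": "2",                        # optional pinned version
        "variables": {"customer_name": "Ada"}, # illustrative variables
    },
    input="Draft a short welcome message.",
)
print(response.output_text)
```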
🐛 Bug Fixes
- Avro: Fixes an issue where Avro instrumentation did not return method results when DSM is enabled.
- Crashtracker: Fixes an issue where environment variables were not inherited by the receiver process.
- Dynamic Instrumentation: Fixes an issue where line probes matched the wrong source file when multiple source files from different Python path entries share the same name.
- Dynamic Instrumentation: Uploading snapshots now retries on all HTTP error codes.
- Exception Replay: Fixed the order in which frames are captured to ensure values of frames close to the initial exception are attached to relevant spans.
- Exception Replay: Fixed an infinite loop that could cause memory leaks when capturing exceptions, and improved overall speed and memory performance.
- Exception Replay: Ensure exception information is captured when exceptions are raised by the GraphQL client library.
- Code Security: Fixes critical memory safety issue in IAST when used with forked worker processes (MCP servers with Gunicorn and Uvicorn) by preventing segmentation faults due to stale PyObject pointers.
- OpenAI: Resolves an issue where instantiating an OpenAI client with a non-string API key resulted in parsing issues.
- Tracing: Fixed a potential `IndexError` in partial flush when the finished span counter was out of sync with actual finished spans.
- Tracing: `DD_TRACE_PARTIAL_FLUSH_MIN_SPANS` values less than 1 now default to 1 with a warning.
- Tracing: Resolves a potential deadlock when forking.
- Tracing/CI Visibility: Ensures the HTTP connection is correctly reset in all error scenarios.
- Ray: Resolves an issue where Ray jobs not explicitly calling `ray.init()` were not properly instrumented; recommends using `ddtrace-run`.
- AAP: Fixes an issue where the AppSec layer was no longer compatible with the Lambda/serverless version of the tracer.
- Lib Injection: Do not inject into the `gsutil` tool.
- LLM Observability: Fixes an issue where `LLMObs.export_span()` would raise when LLM Observability is disabled (a usage sketch follows this list).
- LLM Observability: Resolves an issue where `self` was being annotated as an input parameter using LLM Observability function decorators.
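Related to the `LLMObs.export_span()` fix above, a hedged usage sketch; the decorator and the exported-context keys follow the documented LLM Observability SDK patterns, but check your ddtrace version for the exact shapes:

```python
from ddtrace.llmobs import LLMObs
from ddtrace.llmobs.decorators import workflow


@workflow
def answer(question: str) -> str:
    # Export the active LLM Observability span context (span and trace IDs),
    # e.g. to attach a custom evaluation to it later.
    span_context = LLMObs.export_span(span=None)
    if span_context is not None:
        print(span_context["span_id"], span_context["trace_id"])
    return "42"


# With this fix, calling export_span is safe even when LLM Observability
# has not been enabled; previously it could raise.
answer("What is the answer?")
```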
🔧 Affected Symbols
`aioredis`, `tornado`, `ddtrace.contrib.tornado`, `ExperimentResult`, `threading.BoundedSemaphore`, `threading.Semaphore`, `threading.Condition`, `threading.Lock`, `codeobject.co_qualname`, `asyncio.wait`, `asyncio.as_completed`, `client.beta.messages.create`, `client.beta.messages.stream`, `aiokafka`, `gevent.monkey.patch_all`, `LLMObs.export_span`, `RunnableLambda`
⚡ Deprecations
- Support for Tornado versions older than v6.1 is deprecated. Users should upgrade to Tornado v6.1 or later.
- Programmatic tracing configuration via the `ddtrace.contrib.tornado` module is deprecated. Users should configure tracing using environment variables and `import ddtrace.auto` instead.
- The `ExperimentResult` class's `rows` and `summary_evaluations` attributes are deprecated and will be removed in the next major release. Use the `ExperimentResult.runs` attribute instead to access experiment results and summary evaluations for multi-run experiments (a migration sketch follows).
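A hedged migration sketch for the `ExperimentResult` change; only the `runs` attribute (and the deprecated `rows` / `summary_evaluations`) come from these notes, while the helper below and the shape of each run entry are placeholders:

```python
# Hypothetical helper illustrating the attribute change only.
def summarize(experiment_result) -> None:
    # Deprecated (removed in the next major release):
    #   experiment_result.rows
    #   experiment_result.summary_evaluations
    # New: iterate per-run results instead; each entry corresponds to one
    # run iteration when the experiment was started with the `runs` argument.
    for index, run in enumerate(experiment_result.runs):
        print(f"run {index}: {run}")
```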