v4.6.0rc3
📦 datadog-sdk
Summary
This release introduces inferred proxy support for Azure API Management and enables stats computation by default for Python 3.14+. Several bug fixes address tracing propagation, profiler type compatibility, and race conditions in worker threads.
✨ New Features
- Introduced inferred proxy support for Azure API Management.
- Enabled stats computation by default for Python 3.14 and above.
- Added `LLMObs.publish_evaluator()` to sync locally defined `LLMJudge` evaluators to the Datadog UI as custom LLM-as-Judge evaluations.
- LLM Observability experiments now report execution status (`running`, `completed`, `failed`, or `interrupted`) to the backend.
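The new version-gated default for stats computation can be sketched as a simple check against the running interpreter. The function name below is illustrative, not the SDK's actual API; the real default can typically still be overridden by explicit configuration.

```python
import sys

def stats_computation_default():
    """Hypothetical helper: stats computation defaults to enabled on
    Python 3.14 and above, and stays disabled on older interpreters
    unless the user opts in explicitly."""
    return sys.version_info >= (3, 14)
```

An explicit user setting would take precedence over this default; the check only decides the behavior when no configuration is supplied.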
🐛 Bug Fixes
- Distributed tracing headers are now propagated for Celery tasks that are not registered locally, ensuring correct trace linking across workers.
- Resolved a `TypeError` at import time in the lock profiler's wrapper class when using PEP 604 type union syntax (e.g., `asyncio.Condition | None`).
- Fixed a potential race condition in internal periodic worker threads that could lead to a `RuntimeError` during forks.
- Addressed lock contention in the profiler's greenlet stack sampler, preventing connection pool exhaustion in gevent-based applications.
- Added a timeout to Unix socket connections to prevent thread I/O hangs during pre-fork shutdown.
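The PEP 604 fix above can be illustrated with a minimal, hypothetical stand-in for a profiler wrapper (this is not the SDK's real implementation). When a profiler replaces a lock class with a wrapper *object*, an eagerly evaluated annotation like `asyncio.Condition | None` calls `__or__` on that object and raises `TypeError` unless the wrapper supports it:

```python
import asyncio
import typing

class _ProfiledFactory:
    """Illustrative stand-in for a profiler's wrapper around a lock
    class. Instances replace the original class in the patched module."""

    def __init__(self, wrapped):
        self.__wrapped__ = wrapped

    def __call__(self, *args, **kwargs):
        # A real wrapper would record acquire/release timings here.
        return self.__wrapped__(*args, **kwargs)

    # Without these two methods, `PatchedCondition | None` evaluated at
    # import time raises TypeError, because the wrapper instance is not
    # a type and defines no `__or__`/`__ror__`.
    def __or__(self, other):
        return typing.Union[self.__wrapped__, other]

    def __ror__(self, other):
        return typing.Union[other, self.__wrapped__]

PatchedCondition = _ProfiledFactory(asyncio.Condition)

# PEP 604 union syntax now works in eagerly evaluated module code:
MaybeCondition = PatchedCondition | None
```

Delegating to `typing.Union` keeps the resulting annotation equivalent to one written against the unpatched class.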
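The Unix socket timeout fix amounts to bounding connect and read operations so a worker thread cannot block forever during pre-fork shutdown. A minimal sketch, assuming an illustrative socket path and timeout value (neither is the SDK's real configuration):

```python
import socket

# Hypothetical defaults for illustration only.
AGENT_SOCKET_PATH = "/var/run/datadog/apm.socket"
CONNECT_TIMEOUT_S = 2.0

def make_agent_socket(timeout=CONNECT_TIMEOUT_S):
    """Create a Unix-domain socket with a bounded timeout."""
    sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    # settimeout() bounds connect() and every subsequent send/recv;
    # the default of None would block indefinitely, which is what
    # previously hung worker threads at shutdown.
    sock.settimeout(timeout)
    return sock
```

With a finite timeout, a stalled connection raises `socket.timeout` instead of wedging the thread, letting shutdown proceed.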