
[BUG]: LLMObs spans no longer grouped into traces #15411

@israel-tk

Description
Tracer Version(s)

3.18.1

Python Version(s)

3.11.7

Pip Version(s)

pip 25.3

Bug Report

I am using the LLM Observability plugin with the OpenAI Responses API, in AWS Lambda functions that use the official Datadog Lambda layer.

For some reason, the LLM spans are not grouped into traces in the LLM Observability UI, even though in APM I can see them grouped under the same trace.

With a previous ddtrace version (2.17.0) and the OpenAI Completions API, the LLM spans were correctly grouped into traces.
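
Roughly, the setup looks like the sketch below. This is a simplified illustration rather than the exact production code: the handler, model name, and prompt are placeholders, and it assumes LLM Observability is enabled through the layer's environment variables (DD_LLMOBS_ENABLED=1 and DD_LLMOBS_ML_APP set on the function).

```python
# Illustrative only: a minimal Lambda handler of the kind described above.
# Assumes the official Datadog Lambda layer wraps the handler and that
# DD_LLMOBS_ENABLED=1 and DD_LLMOBS_ML_APP are set on the function.
from openai import OpenAI

client = OpenAI()

def handler(event, context):
    # ddtrace's OpenAI integration auto-instruments this Responses API call;
    # the resulting LLM span is expected to join the Lambda invocation trace.
    response = client.responses.create(
        model="gpt-4o-mini",          # placeholder model
        input="Summarize the event",  # placeholder prompt
    )
    return {"statusCode": 200, "body": response.output_text}
```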

Reproduction Code

No response

Error Logs

No response

Libraries in Use

No response

Operating System

No response
