How does the Trace Details view decide which spans to include? Recently we've run into an issue where clicking on a root span opens Trace Details with multiple other root spans included. The other root spans are from tasks started by the same worker in our app, but the actual spans are each created in separate child worker contexts, so I'm not sure how these are being grouped like this. 🤔
The Trace Details view in Phoenix decides which spans to include based on the trace ID. All spans that share the same trace ID are grouped together in the Trace Details view. This means that if multiple root spans have the same trace ID, they will be displayed together, even if they originate from separate contexts or tasks. This behavior is consistent with the OpenTelemetry model, where a trace is a collection of spans that share a common trace ID, representing a single operation or transaction across different services or components.

If you are seeing multiple root spans grouped together unexpectedly, it might be due to the way trace IDs are being propagated or generated in your application. Ensure that each distinct operation or task that should be treated as a separate trace has a unique trace ID. You can achieve this by correctly managing the trace context propagation across different worker contexts. For more details on managing trace context, you can refer to the OpenTelemetry documentation.

Would you like more detail on how to manage trace context propagation or any other aspect of tracing in Phoenix?

Sources:
- https://arize-ai.slack.com/archives/C04R3GXC8HK/p1724796536.922089
- https://github.com/Arize-ai/phoenix/blob/main/app/src/pages/trace/TraceDetails.tsx
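The grouping rule described above can be sketched in plain Python. This is a toy model, not Phoenix's actual code: span records and field names here are illustrative, but the rule is the same one the answer states — spans are bucketed purely by trace ID, so two root spans that share a trace ID land in the same Trace Details view.

```python
from collections import defaultdict

# Toy span records; in OpenTelemetry, each span carries a trace_id in its
# SpanContext. The names and fields below are illustrative only.
spans = [
    {"name": "worker-task-a", "trace_id": "abc", "parent_id": None},
    {"name": "worker-task-b", "trace_id": "abc", "parent_id": None},
    {"name": "llm-call",      "trace_id": "abc", "parent_id": "worker-task-a"},
    {"name": "other-job",     "trace_id": "def", "parent_id": None},
]

def group_by_trace(spans):
    """Bucket spans by trace_id, the way a trace view groups them."""
    traces = defaultdict(list)
    for span in spans:
        traces[span["trace_id"]].append(span)
    return dict(traces)

traces = group_by_trace(spans)

# Trace "abc" ends up with two root spans (parent_id is None) -- exactly
# the "multiple root spans in one Trace Details view" symptom.
roots = [s for s in traces["abc"] if s["parent_id"] is None]
```

Since grouping keys only on `trace_id`, the fix is always on the producing side: spans that should be separate traces must not share a trace ID.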
💡 Hint: Mention RunLLM in the thread for followups.
In Python, when using LlamaIndex with Phoenix, the trace ID is typically determined by the OpenTelemetry instrumentation. The LlamaIndexInstrumentor is used to automatically create and manage spans and traces. When a new trace is initiated, a unique trace ID is generated by the OpenTelemetry SDK. This trace ID is then propagated across different spans within the same trace to ensure they are grouped together.

The trace ID is generated when a new span is started, and it is part of the span context. If you are manually creating spans, you can use the OpenTelemetry API to start a new span, which will automatically generate a new trace ID if one does not already exist in the current context. For more details on setting up tracing with LlamaIndex, you can refer to the LlamaIndex Instrumentation Guide.

Would you like more detail on how to manually manage trace IDs or any other aspect of tracing with LlamaIndex?

Sources:
- LlamaIndex Instrumentation Guide
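The inheritance rule the answer describes — a span started under an existing trace reuses its trace ID, while a span started with no current trace gets a fresh one — can be modeled with stdlib `contextvars` (which is also what OpenTelemetry uses under the hood for context). This is a hedged sketch, not the real SDK; `start_span`/`end_span` and the fields they return are invented for illustration.

```python
import uuid
from contextvars import ContextVar

# Toy model of OpenTelemetry's trace-ID rule, not the real SDK.
_current_trace_id = ContextVar("trace_id", default=None)

def start_span(name):
    """Start a toy span: inherit the current trace ID, or mint a new one."""
    trace_id = _current_trace_id.get()
    is_root = trace_id is None
    if is_root:
        trace_id = uuid.uuid4().hex  # new trace ID for a new root span
    token = _current_trace_id.set(trace_id)
    return {"name": name, "trace_id": trace_id, "root": is_root, "token": token}

def end_span(span):
    """End a toy span, restoring the previous trace context."""
    _current_trace_id.reset(span["token"])

parent = start_span("enqueue")
child = start_span("worker-task")   # inherits parent's trace_id
end_span(child)
end_span(parent)
other = start_span("separate-job")  # no current trace -> fresh trace_id
end_span(other)
```

This is why serialized context carried from an enqueue side into worker tasks makes all of the workers' spans share one trace ID: the workers never hit the "no current trace" branch.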
OK, that points me in a general direction at least. Thanks.
I think this might be an interaction between our opentelemetry for perf analysis and openinference for llm analysis. The telemetry code for our worker library appears to be serializing context from the enqueue side and setting it on the worker task. Maybe I can reset the openinference trace id in the child tasks. 🤔 In some ways it's useful to have them grouped this way, but it's making navigation in the phoenix UI more challenging right now. 😅
It looks like the opentelemetry code is setting a "traceparent". Anyone happen to know of a way I could erase that for openinference, without affecting our other tracing?
It looks like I might be able to erase it completely with something similar to this example, but it would be cool if I could control when the two telemetry purposes intersect. https://opentelemetry.io/docs/languages/python/cookbook/#manually-setting-span-context
This is what I ended up doing for now:
from opentelemetry import context, trace
from opentelemetry.trace import INVALID_SPAN

token = context.attach(trace.set_span_in_context(INVALID_SPAN))
try:
    ...  # llm stuff
finally:
    context.detach(token)

In this case, the hierarchy isn't coming from llama-index, it's coming from the opentelemetry package we use for arq (a python worker library). We have that package installed for perf tracing, and the perf spans caused the llm spans to be grouped in an unintended way.
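The attach/try/finally/detach shape above can be packaged as a context manager. Here is a stdlib-only sketch of that pattern using `contextvars` (which is how OpenTelemetry's context is implemented); `detached_trace` and `_current_trace_id` are invented names, not the real API, and attaching `None` stands in for attaching `INVALID_SPAN`.

```python
from contextlib import contextmanager
from contextvars import ContextVar

# Toy stand-in for OpenTelemetry's context storage. The real library also
# keeps the current span in a contextvars-backed context, which is why the
# attach/detach pair restores the previous trace on exit.
_current_trace_id = ContextVar("trace_id", default=None)

@contextmanager
def detached_trace():
    """Run the body with no current trace, then restore the previous one."""
    token = _current_trace_id.set(None)  # analogous to attaching INVALID_SPAN
    try:
        yield
    finally:
        _current_trace_id.reset(token)   # analogous to context.detach(token)

# Simulate a trace ID propagated into the worker task by the perf tracing.
_current_trace_id.set("perf-trace-123")

with detached_trace():
    inside = _current_trace_id.get()  # None: LLM spans would start a fresh trace

after = _current_trace_id.get()       # perf trace restored after the block
```

The useful property is the `finally`: even if the LLM code raises, the perf trace context is restored, so only the spans inside the block escape the grouping.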
