Does anyone have an example of using OpenInference to auto-instrument LangChain that includes the ability to extract the trace ID? I'd like to add that trace ID to our AI response metadata for feedback, but when I call trace.get_current_span().get_span_context().trace_id, I get 0 because the trace is invalid.
To automatically instrument LangChain with OpenInference and extract the trace ID, ensure that the OpenTelemetry setup is correctly configured. A trace ID of 0 typically indicates that the span is not properly initialized or that the context is not correctly propagated.

1. Ensure Proper OpenTelemetry Setup: Make sure the OpenTelemetry SDK is correctly set up, including the tracer provider and a span processor.
2. Use OpenInference Instrumentation: Use the OpenInference instrumentation for LangChain to automatically create spans, and ensure it is applied to your LangChain components.
3. Extract Trace ID: Once the setup is correct, you can extract the trace ID using the following code:

```python
from opentelemetry import trace

# Get the current span
current_span = trace.get_current_span()

# Get the trace ID from the current span
trace_id = current_span.get_span_context().trace_id

# Convert the trace ID to a hexadecimal string
trace_id_hex = format(trace_id, '032x')

print(f"Trace ID: {trace_id_hex}")
```

If the trace ID is still 0, verify that spans are being started and ended correctly, and that the context is being propagated across asynchronous boundaries if applicable.

Would you like more detail on any specific part?

Sources:
- GitHub Issue #3558
- Phoenix Documentation on Capturing Feedback
💡 Hint: Mention RunLLM in the thread for followups.
The issue of receiving an invalid or zero trace ID typically indicates a problem with context propagation or span initialization. Here are some steps to troubleshoot and resolve this:

1. Ensure Correct Instrumentation: Verify that the LangChainInstrumentor().instrument() call is made before any LangChain operations are executed. This ensures that all relevant spans are created and managed by the OpenTelemetry instrumentation.
2. Check Context Propagation: Ensure that the OpenTelemetry context is correctly propagated across all parts of your application. If your application involves asynchronous operations or threading, make sure that the context is properly managed and passed along.
3. Verify Tracer Provider Setup: Confirm that the OpenTelemetry tracer provider is correctly set up and that a span processor is added. This setup is crucial for managing and exporting spans.
4. Extract Trace ID: Use the following code to extract the trace ID, ensuring that the current span is valid:

```python
from opentelemetry import trace

# Get the current span
current_span = trace.get_current_span()

# Check if the span is valid
if current_span.is_recording():
    # Get the trace ID from the current span
    trace_id = current_span.get_span_context().trace_id
    # Convert the trace ID to a hexadecimal string
    trace_id_hex = format(trace_id, '032x')
    print(f"Trace ID: {trace_id_hex}")
else:
    print("Current span is not valid.")
```

If the trace ID remains zero, it may indicate that the span is not being started correctly or that the context is not being propagated. Double-check the setup and ensure that the instrumentation is applied at the correct points in your application.

Would you like more detailed guidance on any specific aspect?

Sources:
- https://arize-ai.slack.com/archives/C04R3GXC8HK/p1714876890.678039
- https://arize-ai.slack.com/archives/C04R3GXC8HK/p1734948344.476019
Ensure that the OpenTelemetry context is correctly propagated across all parts of your application. If your application involves asynchronous operations or threading, make sure that the context is properly managed and passed along.
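OpenTelemetry's Python context is built on the standard-library contextvars module, and new threads start with an empty context, which is a common way the current span gets "lost". A stdlib-only sketch of the failure mode and the explicit handoff (the span value here is just a stand-in string, not a real span):

```python
import contextvars
import threading

# Stand-in for OTel's current-span context variable
current_span = contextvars.ContextVar("current_span", default=None)

def worker_without_handoff(results):
    # New threads do NOT inherit contextvars: this sees the default
    results["without"] = current_span.get()

def worker_with_handoff(ctx, results):
    # Run inside the context captured by the parent thread
    results["with"] = ctx.run(current_span.get)

current_span.set("span-123")
ctx = contextvars.copy_context()  # capture the parent's context explicitly

results = {}
t1 = threading.Thread(target=worker_without_handoff, args=(results,))
t2 = threading.Thread(target=worker_with_handoff, args=(ctx, results))
t1.start(); t2.start(); t1.join(); t2.join()

print("without handoff:", results["without"])  # None
print("with handoff:", results["with"])        # span-123
```

OpenTelemetry's own `context.get_current()` / `context.attach()` play the same role as `copy_context()` / `ctx.run()` here: capture the context where the span is active, and re-attach it in the worker.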
I have a feeling that this is the issue. Now I just have to figure out how to do that 🙂
I want to store the trace (or span) ID in my langchain AIMessage additional_kwargs to empower user feedback on specific interactions with the AI
Basically, just grab the ID and store it like this doc shows: https://docs.arize.com/phoenix/tracing/how-to-tracing/capture-feedback. The challenge I'm running into is that we're using the LangChain auto-instrumentation, which appears to complicate things.
instead of using get_current_span from otel, can you try this way?

```python
from openinference.instrumentation.langchain import get_current_span

if span := get_current_span():
    trace_id = span.get_span_context().trace_id
```

I've been able to confirm that this works. However, for future readers who find this thread, a different import from openinference was a better experience.
```python
from openinference.instrumentation.langchain import get_ancestor_spans
```

Calling get_ancestor_spans gives you an array of all ancestors of the current span, so you can get the root span simply by calling:

```python
get_ancestor_spans()[-1].get_span_context().span_id.to_bytes(8, "big").hex()
```

From there, this value can be sent directly to the span annotations endpoint. By annotating the root span, the feedback shows up both on the traces view in Phoenix as well as the spans view, so it's a bit easier to locate.
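For future readers, the to_bytes/hex dance is equivalent to zero-padded hex formatting; a quick stdlib-only check with a made-up 64-bit span ID:

```python
# Made-up span ID (OpenTelemetry span IDs are 64-bit / 8-byte integers)
span_id = 0x00F067AA0BA902B7

# The recipe from the thread: big-endian bytes, then hex
via_bytes = span_id.to_bytes(8, "big").hex()

# Equivalent: 16-character zero-padded lowercase hex
via_format = format(span_id, "016x")

print(via_bytes)  # 00f067aa0ba902b7
assert via_bytes == via_format
```

Note the zero padding matters: plain `hex(span_id)` would drop the leading zero byte, and the annotations endpoint expects the full 16-character form.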
