Hello guys, if I use LangChainInstrumentor, how can I capture user feedback (https://docs.arize.com/phoenix/tracing/how-to-tracing/feedback-and-annotations/capture-feedback)? The problem is that I'm not able to get the span ID when using LangChainInstrumentor. 🥺
To capture user feedback using LangChainInstrumentor, you need to retrieve the current span ID and then send annotations to Phoenix. You can use the get_current_span function from the openinference.instrumentation.langchain module to get the current span. Once you have the span ID, you can send feedback as annotations to Phoenix using its REST API.

Here's a brief overview of the process:
1. Retrieve the current span ID: use get_current_span to get the current span and read its span ID.
2. Send annotations: use the Phoenix API to send annotations to the /v1/span_annotations endpoint. Each annotation should include the span ID, a feedback label, a score, and any additional metadata.

Would you like more detail on how to implement this in your application?

Sources:
- Capture Feedback
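The two steps above can be sketched roughly as follows. This is a hedged sketch, not verified against a live Phoenix instance: the payload shape follows the Capture Feedback doc linked above, and the base URL, annotation name, and label/score values are assumptions for illustration.

```python
import json
from urllib import request

# Assumption: Phoenix running locally on its default port.
PHOENIX_BASE_URL = "http://localhost:6006"


def build_annotation(span_id: str, label: str, score: float) -> dict:
    """Build one span-annotation record in the shape the Phoenix
    /v1/span_annotations endpoint expects (per the Capture Feedback doc)."""
    return {
        "span_id": span_id,           # hex span ID from the traced run
        "name": "user feedback",      # assumed annotation name
        "annotator_kind": "HUMAN",    # feedback comes from a user
        "result": {"label": label, "score": score},
        "metadata": {},
    }


def send_feedback(span_id: str, label: str, score: float) -> None:
    """POST the annotation to Phoenix. Error handling omitted for brevity."""
    payload = json.dumps({"data": [build_annotation(span_id, label, score)]})
    req = request.Request(
        f"{PHOENIX_BASE_URL}/v1/span_annotations?sync=false",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    request.urlopen(req)
```

In a real app you would call send_feedback from your feedback handler (e.g. a thumbs-up/down endpoint), passing the span ID you captured while the run was executing.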
💡 Hint: Mention RunLLM in the thread for followups.
Hey Junqi, this should be better documented (cc John G. when you are back), but RunLLM is correct in that you can get the current span of a run using a utility in the LangChain instrumentor. See https://github.com/Arize-ai/openinference/blob/2c7bd4af321504cdd7261411fe62d889c37[…]ngchain/src/openinference/instrumentation/langchain/__init__.py
so if you have the run_id, you can get the span
The "Send Annotations to Phoenix" section in the doc page above has a code snippet for LangChain showing how to get the span_id:
from opentelemetry.trace import format_span_id
from openinference.instrumentation.langchain import get_current_span

span = get_current_span()
if span is not None:
    span_id = format_span_id(span.get_span_context().span_id)

I actually had this snippet in my code. It works neither inside LangGraph nor outside it.
