Unfortunately RunLLM couldn't help me answer the first question, about getting the span_id. My generate_answer.py file contains the following code, which is called from a FastAPI endpoint in main.py:
from langchain_core.prompts import ChatPromptTemplate
from opentelemetry import trace

import prompts


async def generate_llm_answer() -> str:
    prompt = ChatPromptTemplate.from_messages(
        [
            ("system", prompts.SYSTEM_PROMPT),
            ("human", prompts.EXPANDED_PROMPT),
        ]
    )
    rag_chain = ...
    answer = await rag_chain.ainvoke(summarized_question)
    span = trace.get_current_span()
    span_id = span.get_span_context().span_id.to_bytes(8, "big").hex()
    return answer

But this doesn't get a valid span_id, because span is a NonRecordingSpan.
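The hex conversion itself isn't the issue: a NonRecordingSpan carries the invalid span id 0, so the same formatting just produces all zeros. A minimal stdlib-only sketch (the span_id_to_hex helper is my own illustration, not an OpenTelemetry API):

```python
def span_id_to_hex(span_id: int) -> str:
    # Same conversion as in generate_answer.py: 8 bytes, big-endian, hex string
    return span_id.to_bytes(8, "big").hex()

# A real (recorded) span id formats to 16 lowercase hex characters
print(span_id_to_hex(0x1A2B3C4D5E6F7A8B))  # 1a2b3c4d5e6f7a8b

# The invalid span id (0) that a NonRecordingSpan reports formats to all zeros
print(span_id_to_hex(0))  # 0000000000000000
```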
I've also tried RunLLM's suggested answer, but that seems to generate a new span with a new ID which doesn't match the one already recorded by the LangChainInstrumentor.