Hi Phoenix Support - I recently self-hosted Phoenix and I'm seeing a couple of issues. One is that the cumulative token count never comes through and is always zero. The other is that 'unknown' spans keep showing up in large numbers, even though I haven't hooked those particular methods with @tracer.llm.
For the cumulative token count always being zero: this usually means token count attributes are missing from your LLM spans. Phoenix requires that spans include llm.token_count.prompt, llm.token_count.completion, and llm.token_count.total attributes for token tracking. These are auto-captured by supported auto-instrumentation (e.g., OpenAI, Anthropic), but must be set manually if you instrument your code yourself or use a provider that doesn't return token counts natively. See the Cost Tracking docs and FAQs.

For "unknown" spans: these typically appear when Phoenix receives spans without a recognized span_kind, or when the instrumentation does not set the expected attributes. This can happen if you have generic spans, incomplete instrumentation, or if the tracing library is capturing more than just LLM-related calls. Review your instrumentation setup to ensure that only the intended methods are traced and that all required attributes are set. See How Tracing Works for details.

Would you like more detail or troubleshooting steps?

Sources:
- Cost Tracking docs
- FAQs: Tracing
- How Tracing Works
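As a sketch of the manual path described above (assuming your provider's response exposes prompt and completion token counts, as e.g. the OpenAI SDK's `usage` object does; the helper name is hypothetical), the attributes Phoenix reads can be built like this:

```python
# Hypothetical helper: builds the OpenInference token-count attributes
# that Phoenix reads for token tracking on an LLM span.
def token_count_attributes(prompt_tokens: int, completion_tokens: int) -> dict:
    return {
        "llm.token_count.prompt": prompt_tokens,
        "llm.token_count.completion": completion_tokens,
        "llm.token_count.total": prompt_tokens + completion_tokens,
    }

# Inside an active span you would then do something like:
#   for key, value in token_count_attributes(120, 45).items():
#       span.set_attribute(key, value)
print(token_count_attributes(120, 45)["llm.token_count.total"])  # → 165
```

If these three attributes are absent from every LLM span, the cumulative count in the UI stays at zero.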
💡 Hint: Mention RunLLM in the thread for followups.
Phoenix Support - I see 'unknown' spans keep showing up in large numbers, even though I haven't hooked those particular methods with @tracer.llm.
Hey thanks for using Phoenix and reaching out. Can you describe your tracing setup in a bit more detail? What packages are installed in your environment and what are you trying to instrument?
Hi Tony, I have installed arize-phoenix-otel (pip install arize-phoenix-otel) in my project and used register from phoenix.otel.
I'm trying to trace only the LLM calls from my app.
Which LLM provider SDK are you tracing?
I'm able to see the correct traces from @tracer.llm wherever I've applied it. The problem is that some spans show up with 'unknown' tags.
The methods behind these 'unknown' spans weren't annotated with the hook.
Do you happen to have a snippet of code or an example of what the traces look like within Phoenix? It is difficult to tell the problem without knowing how things are set up. Typically, unknown spans are produced by instrumentation that does not set the span kind attribute. Is it possible that you have automatically instrumented an LLM package at the same time that you are manually instrumenting with decorators?
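To illustrate the span-kind point: openinference.span.kind is the OpenInference attribute name, and the classifier below is a simplified sketch of the behavior, not Phoenix's actual code.

```python
SPAN_KIND_ATTR = "openinference.span.kind"  # OpenInference span-kind attribute

def classify_span(attributes: dict) -> str:
    # Simplified sketch: a span whose attributes carry no span kind
    # is rendered as "unknown" in the Phoenix UI.
    return attributes.get(SPAN_KIND_ATTR, "unknown")

print(classify_span({"http.method": "GET"}))     # → unknown
print(classify_span({SPAN_KIND_ATTR: "LLM"}))    # → LLM
```

So a plain HTTP span emitted by some other instrumentation, with no span kind set, will land in the 'unknown' bucket even though you never decorated that method.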
We have an existing project, and on top of that we are trying to integrate Phoenix for LLM traces alone. We have a collection of REST APIs in our project, and the Phoenix UI is showing all the API traces, which I don't want either.
Phoenix Support - we are using FastAPI for our REST services. Anthony P.
How do I check whether it's automatically instrumented?
Hi Anthony P. - we are registering the component like this:

tracer_provider = register(
    endpoint=url_endpoint,
    project_name=TEST_METRICS,  # sets a project name for spans
    batch=True,                 # uses a batch span processor
    auto_instrument=True,       # uses all installed OpenInference instrumentors
)
tracer = tracer_provider.get_tracer(__name__)

and then we use the @chain hook for specific methods.
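If the unwanted FastAPI/API spans come from auto-instrumentation, one option is to turn it off so that only the decorated methods emit spans. This is a configuration sketch only, reusing the register arguments from the snippet above (url_endpoint and TEST_METRICS are the project's own variables); whether @tracer.chain covers all of your decorated call sites is an assumption to verify against the Phoenix docs.

```python
from phoenix.otel import register

tracer_provider = register(
    endpoint=url_endpoint,       # your collector endpoint
    project_name=TEST_METRICS,
    batch=True,
    auto_instrument=False,       # skip auto-instrumenting installed packages
)
tracer = tracer_provider.get_tracer(__name__)

@tracer.chain
def my_llm_step(prompt: str) -> str:
    ...
```

With auto_instrument=False, nothing is traced unless you decorate it, so the FastAPI route spans (and the resulting 'unknown' entries) should disappear.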