Hi Team, I'm new to Phoenix. I recently upgraded from version 9.6.1 to 12.33.1 and would like to capture LLM cost, but the total cost always shows $0 and no cost appears in the span hierarchy. Each LLM call emits attributes such as model name, provider, prompt token count, completion token count, and total token count. I have checked the settings for model name and price calculation, but cost is still not showing. Could someone share your thoughts on where the disconnect is?
Phoenix requires specific span attributes for cost tracking: `llm.token_count.prompt`, `llm.token_count.completion`, `llm.token_count.total`, `llm.model_name`, and `llm.provider`. If any are missing or mismatched, cost will not be calculated, even if token counts are present. Also, the model name and provider in your traces must match the patterns in your model pricing settings, including any custom entries you've added; regex mismatches or typos in the model name pattern can prevent cost calculation. Double-check that all required attributes are present in your spans and that your model pricing configuration matches the trace data exactly. For more, see the cost tracking documentation and related troubleshooting discussion.

If all attributes and pricing settings are correct but cost is still missing, inspect the raw span data in Phoenix to confirm the attributes are present and correctly formatted. Some instrumentations or LLM providers may not emit all required fields, or may use slightly different model names. Upgrading to the latest instrumentation packages and ensuring your model pricing regex matches the exact model name in the trace often resolves the issue.

Would you like a step-by-step checklist or code example for debugging this further?

Sources:
- Cost Tracking - Phoenix
- Phoenix GitHub Issue: Cost not shown with DSPy instrumentation
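As a quick first check, you can verify the attribute requirement without Phoenix at all. Here's a minimal sketch in plain Python (the helper name and the sample values are made up for illustration) that flags which required keys are absent from one span's attribute dict:

```python
# Required attribute keys per the OpenInference conventions listed above.
REQUIRED = [
    "llm.model_name",
    "llm.provider",
    "llm.token_count.prompt",
    "llm.token_count.completion",
    "llm.token_count.total",
]

def missing_cost_attributes(attributes: dict) -> list:
    """Return the required attribute keys absent from a span's attributes."""
    return [key for key in REQUIRED if key not in attributes]

# Hypothetical span attributes, as you might copy them out of the Phoenix UI.
span_attrs = {
    "llm.model_name": "gpt-4o-mini",
    "llm.provider": "openai",
    "llm.token_count.prompt": 120,
    "llm.token_count.completion": 35,
}
print(missing_cost_attributes(span_attrs))  # → ['llm.token_count.total']
```

Paste in the attributes from one of your real LLM spans; an empty list means the attribute side is fine and the mismatch is more likely in the pricing settings.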
The trace looks mostly correct to me from what I can see in the screenshot, as does the cost definition in model settings; I wonder if the model provider setting is not matching. Are you using custom instrumentation or one of the OpenInference instrumentation packages? If you are using an OpenInference package, can you share the version?
OpenInference instrumentation packages
0.1.44
Can you provide the package name as well please?
openinference-instrumentation version 0.1.44, openinference-instrumentation-litellm version 0.1.29
Thank you 🙏 I will take a look
Thank you so much 🙏
Interesting, I cannot replicate by sending in a minimal span of the same shape as yours, I will need to dig in further, this may take me a bit
Are you using postgres or the default sqlite database?
Postgres
Did these spans exist before you upgraded Phoenix?
No
Would it be possible to share more of one of your llm span attributes? I don't need to see input messages or output messages
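If it helps to strip the message payloads out before sharing, here's a small sketch (plain Python; the key prefixes follow OpenInference naming conventions and the sample dict is hypothetical, so adjust to whatever your spans actually contain):

```python
def redact_messages(attributes: dict) -> dict:
    """Drop message payloads so span attributes can be shared safely."""
    private_prefixes = (
        "llm.input_messages",   # per-message input content
        "llm.output_messages",  # per-message output content
        "input.value",          # raw input payload
        "output.value",         # raw output payload
    )
    return {k: v for k, v in attributes.items()
            if not k.startswith(private_prefixes)}

# Hypothetical span attributes copied from the Phoenix span detail view.
sample = {
    "llm.model_name": "gpt-4o-mini",
    "llm.provider": "openai",
    "llm.token_count.prompt": 120,
    "llm.input_messages.0.message.content": "secret prompt",
    "output.value": "secret answer",
}
print(sorted(redact_messages(sample)))
# → ['llm.model_name', 'llm.provider', 'llm.token_count.prompt']
```

This keeps the model name, provider, and token counts, which is exactly what's needed to debug the cost calculation.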
