Hi team, I am new to Phoenix. I recently upgraded from version 9.6.1 to 12.33.1 and would like to capture the LLM cost, but the total cost is always $0 and no cost shows in the span hierarchy. Each LLM call emits attributes such as model name, provider, prompt token count, completion token count, and total token count. I have checked the settings for model name and price calculation, but the cost is still not showing. Could someone share your thoughts on where the disconnect is?
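For context, Phoenix derives a span's cost from its token-count attributes combined with the per-token prices configured in model settings, keyed on the model name. The sketch below illustrates that idea using OpenInference attribute names; the price table, regex-based model matching, and `estimate_cost` helper are all hypothetical and only approximate what Phoenix does internally:

```python
import re

# Example LLM span attributes, using OpenInference semantic convention keys
# (the token values here are illustrative, not taken from the thread).
span_attributes = {
    "openinference.span.kind": "LLM",
    "llm.model_name": "gpt-4o-mini",
    "llm.provider": "openai",
    "llm.token_count.prompt": 1200,
    "llm.token_count.completion": 300,
    "llm.token_count.total": 1500,
}

# Hypothetical price table: model settings map a model-name pattern to
# per-token prices. Regex matching is a sketch of the matching idea,
# not Phoenix's actual implementation.
price_table = [
    {
        "name_pattern": r"gpt-4o-mini",
        "prompt_price_per_token": 0.15e-6,
        "completion_price_per_token": 0.60e-6,
    },
]

def estimate_cost(attrs, table):
    """Return an estimated USD cost for a span, or 0.0 if nothing matches.

    A persistent $0 total usually means the span's model name (or provider)
    did not match any configured price entry, so no price was applied.
    """
    model = attrs.get("llm.model_name", "")
    for entry in table:
        if re.search(entry["name_pattern"], model):
            prompt = attrs.get("llm.token_count.prompt", 0)
            completion = attrs.get("llm.token_count.completion", 0)
            return (
                prompt * entry["prompt_price_per_token"]
                + completion * entry["completion_price_per_token"]
            )
    return 0.0  # no matching price entry -> cost shows as $0

print(round(estimate_cost(span_attributes, price_table), 8))
```

The takeaway: if the token-count attributes are present but cost is $0, the first thing to verify is that the model name emitted on the span exactly satisfies the name pattern in the price settings.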
The trace looks mostly correct to me from what I can see in the screenshot, as does the cost definition in model settings; I wonder if the model provider setting doesn't match. Are you using custom instrumentation or one of the OpenInference instrumentation packages? If you are using an OpenInference package, can you share the version?
OpenInference instrumentation packages
0.1.44
Can you provide the package name as well please?
openinference-instrumentation version=0.1.44, openinference-instrumentation-litellm version=0.1.29
Thank you 🙏 I will take a look
Thank you so much 🙏
Interesting. I cannot replicate it by sending in a minimal span of the same shape as yours, so I will need to dig in further; this may take me a bit.
Are you using Postgres or the default SQLite database?
Postgres
Did these spans exist before you upgraded Phoenix?
No
Would it be possible to share more of one of your LLM span attributes? I don't need to see the input or output messages.
