Hi, is there a way to see llm model temperature in the trace/span in UI?
Yes, Phoenix traces capture LLM parameters such as temperature, and you can inspect them, including temperature, in the trace details within the UI. This lets you validate and debug the configuration used for each LLM call, as described in the documentation and shown in trace attribute examples. To view the temperature, open a trace in the Phoenix UI, select the relevant LLM span, and look for the `llm.invocation_parameters` (or similar) attribute, which includes the temperature value used for that call. Would you like more detail?

Sources:
- Tracing Overview: Tracing
- docs/tracing/llm-traces.md
- semantic_conventions.md
Hi [private user], I use "openinference-instrumentation-openai>=0.1.30". Is `invocation_parameters` the right place to look for temperature?
Yes, that should be where temperature is recorded. Was it passed explicitly into the SDK call?
I suppose so, but let me double-check
Great, if it's being passed in and it's still not recorded, please file an issue and we'll investigate
Works well, thanks
Great, I'm glad! Sorry we can't instrument this eagerly; server-side defaults aren't visible to us
