Also what's the difference between using the litellm callbacks and following the instructions in the phoenix docs on how to use litellm with the otel provider etc?
The LiteLLM "arize" callback integration is designed to send traces to the Arize SaaS platform, not to a self-hosted Phoenix instance. The Phoenix documentation instead instructs you to use OpenInference's LiteLLM instrumentation with OpenTelemetry (OTEL), which exports traces directly to your Phoenix instance (local or cloud) and is the officially supported method for Phoenix observability with LiteLLM. The OTEL approach is more flexible and is actively maintained for Phoenix use cases[^1][^2][^3].

If you use the LiteLLM "arize" callback with Phoenix, traces may not appear or may cause errors, because that integration is not intended for Phoenix and is maintained by the LiteLLM team, not Arize. For Phoenix, always use the OpenInference OTEL instrumentation as described in the Phoenix docs and the OpenInference README.

Would you like more detail or a code example?

Sources:
- Phoenix LiteLLM Integration Docs
- OpenInference LiteLLM Instrumentation README
- Phoenix Issue: UI unable to render the trace of litellm span
Xander S. I was referring to these: https://docs.litellm.ai/docs/observability/phoenix_integration.
