Yes, Phoenix supports tracing and observability for Claude (Anthropic) models and for OpenAI Codex through its integrations. For Claude, instrument your LLM calls with the Anthropic integration; for Codex, use the OpenAI integration. Both stream traces to Phoenix for observability and evaluation. See the Anthropic and OpenAI integration documentation for setup details; a minimal instrumentation sketch follows the sources below.

Would you like more detail on how to set up tracing for either provider?

Sources:
- https://github.com/Arize-ai/openinference/issues/1862
- https://github.com/Arize-ai/phoenix/issues/8380
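For Claude, a minimal sketch of what the setup can look like, assuming the `arize-phoenix-otel`, `openinference-instrumentation-anthropic`, and `anthropic` packages and a Phoenix instance running at the default local endpoint (the project name, endpoint, model id, and prompt below are placeholders):

```python
# pip install arize-phoenix-otel openinference-instrumentation-anthropic anthropic
from phoenix.otel import register
from openinference.instrumentation.anthropic import AnthropicInstrumentor
import anthropic

# Point the tracer at a Phoenix collector (endpoint and project name are placeholders).
tracer_provider = register(
    project_name="claude-tracing-demo",
    endpoint="http://localhost:6006/v1/traces",
)

# Instrument the Anthropic SDK so each messages.create call is exported as a trace.
AnthropicInstrumentor().instrument(tracer_provider=tracer_provider)

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
response = client.messages.create(
    model="claude-3-5-sonnet-latest",  # any Claude model id
    max_tokens=256,
    messages=[{"role": "user", "content": "Summarize the benefits of tracing."}],
)
print(response.content[0].text)
```

For Codex (or other OpenAI models), the pattern is the same, but you would instrument with `OpenAIInstrumentor` from `openinference.instrumentation.openai` instead of the Anthropic instrumentor.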