Has anyone compared Arize Phoenix with Langfuse (both self-hosted) for LLM observability? What are the pros and cons of picking either solution?
Does Phoenix support Google Gemini API?