Hi everyone. With Mastra deprecating their current OTel-based telemetry and moving to their new AI tracing, is there a plan to support that in the near future? Currently they only support Langfuse (they created a @mastra/langfuse library). Moreover, the current OTel-based telemetry integration with Phoenix is broken with streamVNext and generateVNext: the core AI spans are missing, and the remaining agent.* spans are not very useful.
Phoenix does not currently support Mastra's new AI-tracing protocol; integration is only available via OTel-based telemetry, which is now deprecated and partially broken for Mastra's latest APIs (streamVNext, generateVNext). There is no official announcement or timeline for Phoenix to support Mastra's new AI-tracing protocol, and the current documentation and code examples only cover the OTel-based integration, which is known to miss core AI spans with recent Mastra versions (Mastra Tracing Docs, Mastra Observability Docs, OpenInference Mastra README).

If you need full tracing with Mastra's latest agent APIs, you will need to wait either for Mastra to add OTel support to their new protocol or for Phoenix/OpenInference to add native support for Mastra's AI tracing. For now, only Langfuse is officially supported by Mastra's new tracing system (Mastra Observability Docs). Would you like more detail on current workarounds or how to monitor for future updates?

Sources:
- Mastra Observability Docs
- Mastra Tracing Docs
- OpenInference Mastra README
I've been taking a look at this a bit as my org prepares to upgrade Mastra versions. I'm happy to be involved via code contributions or discussions — just let me know!
So Mastra is coming out with OTEL support for AI tracing. Given that, I lean towards waiting until they ship it, since that will make the integration easier to maintain on our end. But let me know if we think there's room for a first-party integration.
I am very happy to hear that they will be adding OTEL. I agree that feels like the right way to go
Has there been any further discussion on this topic? It appears Mastra plans to deprecate the current tracing on November 4th.
They’ve recently released OTEL support for the new AI tracing feature. I tested it and successfully pushed traces to Phoenix. While all the trace data appears to be present in Phoenix, the visualization isn’t as polished as what you get through the AI SDK — for example, the same raw JSON appears in both the Info and Attributes tabs. Their AI Tracing documentation also mentions “Arize – Coming Soon”, though I’m not sure whether that’s simply a convenience wrapper around the OTEL integration.
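In case it helps anyone else trying this: the receiving side is just plain OTLP. Below is a minimal sketch of the exporter setup I mean — it assumes a local Phoenix instance on its default port (6006, OTLP/HTTP traces at /v1/traces) and the standard @opentelemetry Node packages; this is generic OTel wiring, not a Mastra-specific or official Arize API:

```typescript
// Minimal OTLP-over-HTTP trace export to a local Phoenix instance.
// Assumptions: @opentelemetry/sdk-trace-node >= 2.x (spanProcessors is
// passed in the constructor) and Phoenix running at localhost:6006.
// Adjust the URL and add auth headers for a hosted deployment.
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { SimpleSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";

const exporter = new OTLPTraceExporter({
  // Phoenix accepts OTLP/HTTP traces at this endpoint by default.
  url: "http://localhost:6006/v1/traces",
});

const provider = new NodeTracerProvider({
  // SimpleSpanProcessor exports each span as it ends; swap in
  // BatchSpanProcessor for anything beyond local testing.
  spanProcessors: [new SimpleSpanProcessor(exporter)],
});

// Register as the global tracer provider so instrumented code
// (including Mastra's OTEL output) picks it up.
provider.register();
```

Whatever Mastra's AI-tracing OTEL exporter config looks like, pointing it at the same /v1/traces URL is what got spans showing up in Phoenix for me.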
Did you export using the OpenInference library, or directly from Mastra to Phoenix? Glad to hear that it's not super far off.
Is this something the Arize team is working on, or Mastra?
