Hi Team, I'm trying to set up tracing using agno. I've successfully managed to set it up for agents that use OpenAI models, but for AWS Bedrock models, when streaming is enabled, the output is not "prettified." Specifically, with streaming enabled, I receive raw text chunks (as shown in the attached picture). When I disable streaming, the traces look correct and display the actual LLM output as expected, rather than JSON. Can someone please help me resolve this? P.S. Great booth at AI Engineer Paris; I'm here because of it!
I have the following packages installed in my requirements.txt:
arize-phoenix==11.37.0
openinference-instrumentation-agno==0.1.15
arize-phoenix-otel==0.13.1
opentelemetry-exporter-otlp==1.37.0
openinference-instrumentation-openai==0.1.33
openinference-instrumentation-bedrock==0.1.27
openinference-instrumentation-anthropic==0.1.19
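For context, here is roughly how the tracing is wired up. This is a minimal sketch assuming the `phoenix.otel.register` helper from `arize-phoenix-otel` and the `AgnoInstrumentor` from `openinference-instrumentation-agno`; the endpoint and project name are placeholders, not values from my actual setup:

```python
# Sketch of the Phoenix + Agno tracing setup (assumed APIs from the
# pinned packages above; endpoint/project name are placeholders).
from phoenix.otel import register
from openinference.instrumentation.agno import AgnoInstrumentor

# Configure an OTLP exporter pointed at a Phoenix collector.
tracer_provider = register(
    project_name="my-agno-app",  # placeholder project name
    endpoint="http://localhost:6006/v1/traces",  # placeholder endpoint
)

# Instrument Agno so agent runs (including model calls) emit spans.
AgnoInstrumentor().instrument(tracer_provider=tracer_provider)
```

With this in place, non-streaming Bedrock runs produce readable traces; the problem described above only appears when streaming is enabled.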
Hey Andrei I., a human here 🙂. We co-maintain the instrumentation with the Agno team in this repo: https://github.com/Arize-ai/openinference. Definitely file an issue with us and I can raise it with their team. Thanks for reporting!
No problem, keep the feedback coming.
Mikyo, here is the issue: https://github.com/Arize-ai/openinference/issues/2236 (in case sending it here might speed up the process a bit 😂). Thanks again for the help.
