I used Phoenix to get tracing for my Agno agent, but I am not able to get the token count for each call
Phoenix supports token count tracing for many LLM providers and frameworks, but token counts may not appear if the underlying agent or integration (such as Agno) does not expose or propagate token usage in its spans. There is no explicit documentation or example confirming token count support for Agno agent tracing in Phoenix, and missing token counts is a known limitation of some integrations (as seen with LlamaIndex and VertexAI) unless additional setup is performed or the integration is updated to pass token counts through callback hooks or span attributes.

If you require token counts, check whether your Agno agent or its LLM backend exposes token usage, and ensure the Phoenix instrumentation is capturing those attributes. Otherwise, you may need to manually add token count information to the spans, or request support from the Agno or Phoenix maintainers. Would you like more detail on how to manually add token counts or check your integration?

Sources:
- Phoenix issue: LlamaIndex token count capture
- Phoenix issue: DSPy connector not showing the token count
- OpenInference issue: serialization error and missing token counts with LlamaIndex and Gemini
I am using OpenAI
If you add the openinference-instrumentation-openai package (https://pypi.org/project/openinference-instrumentation-openai/), you should get them, I believe
