Hello, we’re looking at Arize for agentic observability. I know there is support for tracing images, but this support doesn’t include out-of-band upload of base64 images to blob storage. Is this something you are working on, or is it somehow already supported? We’re testing Arize right now, but base64 images cause a lot of issues in the framework and also slow down the whole UI. It would be good to understand whether this can be supported.
This is what I think most other providers do, e.g. Weave or Langfuse. The Langfuse docs explain it like this:
"By default, base64 encoded data URIs are handled automatically by the Langfuse SDKs. They are extracted from the payloads commonly used in multi-modal LLMs, uploaded to Langfuse’s object storage, and linked to the trace."
and
"File uniqueness determined by project, content type, and content SHA256 hash"
https://langfuse.com/docs/observability/features/multi-modality
Having something similar to this in Arize would make sense to me.
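To illustrate what I mean, here is a minimal sketch of the behavior described in those docs. This is not a real Arize or Langfuse API; `offload_data_uris`, the `blob://` reference scheme, and the dict standing in for a blob-storage client are all hypothetical. It just shows the idea: extract base64 data URIs from a payload, dedupe by content type plus SHA-256 of the decoded bytes, upload once, and leave a small reference in the trace instead of the inline data.

```python
import base64
import hashlib
import re

# Matches inline data URIs like "data:image/png;base64,AAAA..."
DATA_URI_RE = re.compile(r"data:(?P<mime>[\w/+.-]+);base64,(?P<data>[A-Za-z0-9+/=]+)")

def offload_data_uris(text: str, store: dict) -> str:
    """Replace inline base64 data URIs with short storage references.

    `store` is a stand-in for a blob-storage client; its keys mimic the
    (content type, SHA-256) uniqueness rule quoted from the Langfuse docs.
    """
    def _replace(match: re.Match) -> str:
        raw = base64.b64decode(match.group("data"))
        digest = hashlib.sha256(raw).hexdigest()
        key = f"{match.group('mime')}/{digest}"
        store.setdefault(key, raw)  # upload only if this blob is new
        return f"blob://{key}"      # trace keeps a reference, not the bytes
    return DATA_URI_RE.sub(_replace, text)
```

The point is that the trace payload shrinks to a short reference, so the UI never has to render or transfer megabytes of inline base64.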
I use LiteLLM with the OpenInference instrumentor. Langfuse is just an example of a provider that documents how they manage this.
