Hey Rusty - we've thought about adding that compatibility layer with llama-index but just haven't gotten to it, since we have our own context managers for that: https://docs.arize.com/phoenix/tracing/how-to-tracing/customize-spans#using_attributes
To add custom attributes to spans in Phoenix, you can use the OpenTelemetry (OTEL) framework, which Phoenix supports. This lets you inject metadata into spans, such as user-specific data or other contextual information, by modifying the span creation or processing logic to include the extra attributes. This is a common practice in distributed tracing and is supported by Phoenix's integration with OpenInference and OTEL. If you need more detailed guidance on implementing this, please let me know.
Sources: - Phoenix Tracing with LlamaIndex
💡 Hint: Mention RunLLM in the thread for followups.
This README might also be helpful: https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation
Thanks. Yeah, I've been trying to follow the llama index instrumentation format, so we can layer in other consumers as well. It's probably still too early to say what's better or not though.
I can use the OpenInference methods in the meantime 👍🏻
