If anyone's interested, I've written a working Arize Phoenix integration for Langchain4j to get observability via OpenTelemetry in Java. It covers most of the OpenInference semconv spec.
Phoenix supports OpenTelemetry-compatible tracing, and any instrumentation (including custom ones like your Langchain4j integration) that emits traces following the OpenInference semantic conventions will work with Phoenix for LLM observability. This is the documented, recommended way to integrate non-Python frameworks or custom SDKs: as long as the traces follow the OpenInference spec and are sent to the Phoenix collector endpoint (e.g., http://localhost:6006/v1/traces) via an OTLP exporter, Phoenix will pick them up. For reference, see the OpenInference semantic conventions and the Phoenix custom spans documentation.

To get full compatibility and rich UI features in Phoenix, make sure your spans include the key OpenInference attributes (e.g., span kind, input/output messages, model name, token counts). Would you like more detail on which attributes are most important or how to test your integration?

Sources:
- Phoenix custom spans documentation
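To make the attribute requirements concrete, here is a minimal Java sketch of emitting one LLM span with the core OpenInference attributes and exporting it over OTLP/HTTP to a local Phoenix collector. The class name, model name, and message strings are illustrative placeholders, not part of the actual integration; the attribute keys follow the OpenInference semantic conventions.

```java
import io.opentelemetry.api.common.AttributeKey;
import io.opentelemetry.api.common.Attributes;
import io.opentelemetry.api.trace.Span;
import io.opentelemetry.api.trace.Tracer;
import io.opentelemetry.exporter.otlp.http.trace.OtlpHttpSpanExporter;
import io.opentelemetry.sdk.OpenTelemetrySdk;
import io.opentelemetry.sdk.trace.SdkTracerProvider;
import io.opentelemetry.sdk.trace.export.BatchSpanProcessor;

public class PhoenixTraceExample {

    // Build the key OpenInference attributes Phoenix uses to render an LLM span.
    static Attributes openInferenceAttributes(String model, String input, String output,
                                              long promptTokens, long completionTokens) {
        return Attributes.builder()
                .put("openinference.span.kind", "LLM")
                .put("llm.model_name", model)
                .put("input.value", input)
                .put("output.value", output)
                .put("llm.token_count.prompt", promptTokens)
                .put("llm.token_count.completion", completionTokens)
                .put("llm.token_count.total", promptTokens + completionTokens)
                .build();
    }

    public static void main(String[] args) {
        // Export spans over OTLP/HTTP to the Phoenix collector endpoint.
        OtlpHttpSpanExporter exporter = OtlpHttpSpanExporter.builder()
                .setEndpoint("http://localhost:6006/v1/traces")
                .build();

        SdkTracerProvider tracerProvider = SdkTracerProvider.builder()
                .addSpanProcessor(BatchSpanProcessor.builder(exporter).build())
                .build();

        Tracer tracer = OpenTelemetrySdk.builder()
                .setTracerProvider(tracerProvider)
                .build()
                .getTracer("langchain4j-phoenix-demo");

        // One span per model call; the attributes drive Phoenix's LLM UI.
        Span span = tracer.spanBuilder("ChatLanguageModel.generate").startSpan();
        try {
            span.setAllAttributes(openInferenceAttributes(
                    "gpt-4o-mini",                         // assumed model name
                    "What is OpenInference?",              // prompt
                    "A semantic convention for LLM spans.",// completion
                    12L, 24L));
        } finally {
            span.end();
        }
        tracerProvider.shutdown();
    }
}
```

Setting `openinference.span.kind` to `LLM` (vs. `CHAIN`, `RETRIEVER`, etc.) is what tells Phoenix which UI treatment to apply, so it is the one attribute worth verifying first when spans show up but render poorly.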
💡 Hint: Mention RunLLM in the thread for followups.