With a trace like this, is it possible to annotate individual documents? The docs suggest so here (https://arize.com/docs/phoenix/tracing/concepts-tracing/annotations-concepts#annotation-targets):
Document Annotations: Specifically for retrieval systems, evaluating individual documents with metrics like relevance and precision
But I don't see how it's possible in the UI.
Yes, Phoenix supports annotating individual documents (document annotations) for retrieval systems, allowing you to evaluate each document with metrics like relevance and precision, as described in the documentation (Annotations Concepts). However, the UI currently focuses on span-level annotations, and direct document-level annotation via the UI is not explicitly documented or available. To annotate individual documents, you should use the Phoenix client SDKs (Python or TypeScript) to log document annotations programmatically, specifying the span_id and document_position for each annotation (LLM Evaluations: Document Evaluations); see the sketch after the sources below. Would you like more detail or a code example for programmatic document annotation?
Sources:
- Annotations Concepts
- LLM Evaluations: Document Evaluations
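For reference, a minimal sketch of what the programmatic route could look like with the Python SDK, assuming the `log_evaluations` / `DocumentEvaluations` interface described in the Document Evaluations doc. The span IDs are placeholders, and the exact index/column names (`span_id` vs. `context.span_id`, etc.) should be checked against your Phoenix version:

```python
import pandas as pd
import phoenix as px
from phoenix.trace import DocumentEvaluations

# One row per (retriever span, retrieved document) pair. The span IDs below are
# placeholders; in practice they come from your traced retriever spans.
document_evals_df = pd.DataFrame(
    {
        "span_id": ["9f4e2c31a8b7d605", "9f4e2c31a8b7d605"],
        "document_position": [0, 1],  # index of the document within the span's retrieval
        "score": [1, 0],
        "label": ["relevant", "irrelevant"],
        "explanation": ["Directly answers the question.", "Off-topic chunk."],
    }
).set_index(["span_id", "document_position"])

# Attach the evaluations to the individual documents on their retriever spans.
px.Client().log_evaluations(
    DocumentEvaluations(eval_name="Relevance", dataframe=document_evals_df)
)
```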
💡 Hint: Mention RunLLM in the thread for followups.
yes. are there plans to support this in the UI?
There are plans and ongoing work to support document-level annotation in the Phoenix UI, but as of now, this feature is not fully available. Recent updates and GitHub discussions confirm that UI enhancements for annotations are being actively developed, with the span-level annotation UI already released and document annotation support mentioned as a requested and tracked enhancement (GitHub Issue #6927, Release Notes, PR #3914). Would you like more detail on the current status or how to follow progress on this feature?
Sources:
- GitHub Issue #6927
- Release Notes
- PR #3914
💡 Hint: Mention RunLLM in the thread for followups.
hey Andrew S., this is probably not that hard given the state of AI these days. Can you give us a bit more detail on the type of annotations you'd want to add? Are they basically relevance per document? Do you need multiple annotations per document? I'll fire off an agent to work on it - just need to better understand what UX you'd like
Right now it's simply evaluating precision and recall for retrieval. So false negative/false positive with an optional note is all I really need at the moment. https://www.loom.com/share/b4c322552955434a9059c17e22438e30
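Until something lands in the UI, one workaround sketch under the same assumptions as above (the `DocumentEvaluations` interface, placeholder span ID): record the true/false positive judgment as the label and put the optional note in the explanation column. Documents that were never retrieved don't attach to a retriever span, so false negatives wouldn't fit this path and would presumably need the UI feature or a span-level annotation.

```python
import pandas as pd
import phoenix as px
from phoenix.trace import DocumentEvaluations

# Placeholder span ID; each retrieved document gets a true/false positive label,
# with the optional note carried in the explanation column.
retrieval_precision_df = pd.DataFrame(
    {
        "span_id": ["9f4e2c31a8b7d605", "9f4e2c31a8b7d605"],
        "document_position": [0, 1],
        "label": ["true_positive", "false_positive"],
        "score": [1, 0],
        "explanation": [None, "Matches keywords but doesn't answer the question."],
    }
).set_index(["span_id", "document_position"])

px.Client().log_evaluations(
    DocumentEvaluations(eval_name="Retrieval Precision", dataframe=retrieval_precision_df)
)
```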
