Hi guys - unsure if I'm in the right channel 😃 Looks like the docs at this link are missing a link to extra documentation: https://arize.com/docs/phoenix/learn/tracing/annotations-concepts#feedback-from-end-users
The sentence "For more information on how to wire up your application to collect feedback from your users, see" just ends there. I'm curious what that would've linked to, as I'd like to wire my Phoenix up to store user feedback. Thanks!
You are correct: the documentation at that link ends with "For more information on how to wire up your application to collect feedback from your users, see" but does not provide the actual link.

However, to wire up Phoenix to store user feedback (such as thumbs up/down), you should use the Phoenix SDK or REST API to attach feedback as annotations to spans or traces. The official guide for this is available at Annotating via the Client, which includes example code for Python, TypeScript, and curl.

In summary: use the Phoenix client or REST API to send feedback as annotations, specifying the span ID, annotation name, annotator kind (e.g., HUMAN), label, score, and optional metadata. Would you like a step-by-step code example for your use case?

Sources:
- https://arize.com/docs/phoenix/learn/tracing/annotations-concepts#feedback-from-end-users
- https://arize.com/docs/phoenix/tracing/how-to-tracing/feedback-and-annotations/capture-feedback
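To make the summary above concrete, here is a minimal sketch of sending a thumbs up/down signal to Phoenix's REST span-annotation endpoint. It assumes a Phoenix instance at `http://localhost:6006` and a `/v1/span_annotations` route; the annotation name `user_feedback` and the payload field names (`annotator_kind`, `result.label`, `result.score`, `metadata`) are drawn from the concepts page but should be verified against the Annotating via the Client guide for your Phoenix version.

```python
# Hedged sketch: payload shape follows the annotation fields described in the
# Phoenix docs (span ID, name, annotator kind, label, score, metadata).
# Endpoint path and field names are assumptions; check them against the
# "Annotating via the Client" guide for your Phoenix version.
import json
import urllib.request

PHOENIX_URL = "http://localhost:6006"  # assumed local Phoenix instance


def build_feedback_annotation(span_id, thumbs_up, comment=None):
    """Build a span-annotation payload for an end-user thumbs up/down."""
    return {
        "data": [
            {
                "span_id": span_id,
                "name": "user_feedback",    # hypothetical annotation name
                "annotator_kind": "HUMAN",  # feedback comes from an end user
                "result": {
                    "label": "thumbs_up" if thumbs_up else "thumbs_down",
                    "score": 1.0 if thumbs_up else 0.0,
                },
                "metadata": {"comment": comment} if comment else {},
            }
        ]
    }


def send_feedback(span_id, thumbs_up, comment=None):
    """POST the annotation to the (assumed) Phoenix REST endpoint."""
    payload = build_feedback_annotation(span_id, thumbs_up, comment)
    req = urllib.request.Request(
        f"{PHOENIX_URL}/v1/span_annotations",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The Phoenix Python client offers a higher-level way to do the same thing, so if you are already using the SDK, prefer its annotation helpers over raw HTTP; the REST sketch above is mainly useful from non-Python backends.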
💡 Hint: Mention RunLLM in the thread for followups.
