I see something weird: when I click on the last LLM trace it displays no detailed information. I can only click on the previous LLM trace and it displays fine. The second one does not open.
Hey Adam, I think I understand what you are pointing out: if you click outside of the link itself on the last row, it only selects the trace and doesn't highlight the LLM span in the trace. Is that what you are referring to?
Hey Mikyo, I think I was doing something wrong. I'm trying to figure out how to run the session and get my external app to talk to the px client. I think I was doing something that was hanging the server, so clicking on the last row made it unresponsive. I think I can move on, but if it persists I'll let you know.
I just realized that I have to start a session in the notebook, then start my external app to post to that server.
Sounds good Adam. P.S. technically you don't have to start the session in the notebook to be able to talk to it. You can initialize the client with the URL of your running phoenix instance: https://docs.arize.com/phoenix/api/client#phoenix.client
I'm confused, I thought the steps to get the stream were:
Start notebook and then run session = px.launch_app()
Start my external app with llama-index and run set_global_handler("arize_phoenix")
The app will run and send traces to that px.launch_app() thing
the external app exits
but my session still exists, I can re-run the external app and it will send data to that
is that correct?
Yup that's the notebook flow! Sorry, I may have misread your message. Some people choose to run Phoenix as a container or in a separate process; I thought you may have done the same.
Wait how do I do that? I think I want to move in that direction. I want to do this without the notebook.
When you are ready you can try out the container deployment: https://docs.arize.com/phoenix/reference/deploying-phoenix We are still actively working on this, so there are some moving parts. If you are using llama_index you can try out composing Phoenix with create-llama here: https://github.com/Arize-ai/openinference/tree/main/python/examples/llama-index
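For reference, a container launch might look something like this (the image name and port are assumptions for illustration; check the deployment docs above for the current instructions):

```shell
# Assumed example: run Phoenix as a standalone container and expose
# its UI/collector port. Verify the image name and port in the docs.
docker run -p 6006:6006 arizephoenix/phoenix:latest
```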
wow, I do not know how I missed this, thank you so much!
Mikyo, do you know how to set up the project name when you use https://github.com/Arize-ai/openinference/blob/335de024b04a057f83f9c4c9aa04186b30a[…]nce-instrumentation-llama-index/examples/chroma_query_engine.py
Hey Adam K., just a heads up, we are still actively working on projects so you might be a beta tester :) but the TracerProvider takes a Resource and you can set the project there. Here's how it's done internally: https://github.com/Arize-ai/phoenix/blob/main/src/phoenix/trace/openai/instrumentor.py
I was just finding that as you were typing.
OK, one more question, and hopefully I can stop bugging you. When I use this add_span etc. it keeps dumping raw data out to my console; is there a way to stop all that?
I'm guessing you've mounted a ConsoleSpanExporter too. You can just remove that.
