Hi team, I am trying to deploy Phoenix through Docker as a sidecar. I followed the example provided here: https://github.com/Arize-ai/openinference/tree/main/python/examples/llama-index. I created instrument.py and initialized it in my main API; my docker-compose file is shown in the image below. Whenever I run queries through Streamlit, I am not seeing any traces on the Phoenix server. If I run Phoenix as a separate container on its own, and run Streamlit and the backend API outside Docker, I can see the traces. I am not sure where I am going wrong.
I might be wrong, but it looks like there's probably a networking issue in your compose file. Does it work if you specify the network explicitly, as described here? https://docs.docker.com/compose/networking/
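For reference, a minimal sketch of what a compose file with both services on the same default network might look like. The service names (`phoenix`, `api`), the image tag, and the environment variable are assumptions for illustration, not taken from the original setup:

```yaml
services:
  phoenix:
    image: arizephoenix/phoenix:latest  # assumed image; check the Phoenix docs
    ports:
      - "6006:6006"

  api:
    build: .
    environment:
      # Inside the compose network, the Phoenix container is reachable by its
      # service name, not by 127.0.0.1 (which refers to the api container itself).
      PHOENIX_COLLECTOR_ENDPOINT: "http://phoenix:6006"
    depends_on:
      - phoenix
```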
I tried this approach and I am getting the same error: "requests.exceptions.ConnectionError: HTTPConnectionPool(host='127.0.0.1', port=6006): Max retries exceeded with url: /v1/traces (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fb53191c610>: Failed to establish a new connection: [Errno 111] Connection refused'))"
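That `host='127.0.0.1'` in the traceback is the telltale sign: inside the API container, 127.0.0.1 refers to that container itself, not to the Phoenix sidecar. One way around this is to build the collector endpoint from environment variables in instrument.py instead of hardcoding localhost. A minimal sketch, assuming a compose service named `phoenix` and hypothetical `PHOENIX_HOST`/`PHOENIX_PORT` variable names:

```python
import os


def collector_endpoint(default_host: str = "phoenix", default_port: str = "6006") -> str:
    """Build the OTLP/HTTP traces endpoint for the Phoenix collector.

    Inside a docker-compose network, other containers are reached by their
    service name ("phoenix" here is an assumption), never by 127.0.0.1,
    which points back at the calling container.
    """
    host = os.environ.get("PHOENIX_HOST", default_host)
    port = os.environ.get("PHOENIX_PORT", default_port)
    return f"http://{host}:{port}/v1/traces"
```

The returned URL would then be passed as the `endpoint` argument of the OTLP span exporter that instrument.py configures; when running outside Docker you'd set `PHOENIX_HOST=localhost` instead.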
Here you go.
Huh, I'm not quite sure what's wrong. I assume you are using openinference-instrumentation-llama-index. The only way I can think of that you would end up with localhost is if the Phoenix endpoint is configured inside the assistant code itself. Sorry I couldn't figure it out. Let us know if you have any more details that might help us troubleshoot.
