Ciao Luca B.! You can do this by setting the environment variable PHOENIX_COLLECTOR_ENDPOINT to point to the server running in a different process or container. https://docs.arize.com/phoenix/environments
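A minimal sketch of what that looks like in Python before any Phoenix client code runs (the hostname and port here are placeholders, not defaults - substitute your own collector address):

```python
import os

# Point Phoenix at a collector running in another process or container.
# "phoenix-server" and port 6006 are example values; use the host/port
# where your Phoenix server is actually listening.
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "http://phoenix-server:6006"
```

Setting it in the shell (`export PHOENIX_COLLECTOR_ENDPOINT=...`) before launching your app works just as well, as long as it's set before the Phoenix client is initialized.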
Hey Luca! You certainly could! Are you looking to trace things that wrap LlamaIndex? What are you looking to instrument? If so, you can certainly publish your own spans via the tracer. We are definitely going to support more custom instrumentation, but let us know if you hit any walls! https://github.com/Arize-ai/phoenix/blob/main/src/phoenix/trace/tracer.py