Hi,
I want to implement an API gateway or API proxy in the Playground. Is it possible to implement an API proxy? If yes, how?
I have this:

```python
from openai import OpenAI

client = OpenAI(api_key="xxxxxxxxx", base_url="https://xyz.com/v1")
```
Thank you 🙂.
Phoenix Playground does not natively support acting as an API proxy or gateway to capture OpenTelemetry traces or span data without modifying your application code. Phoenix and OpenInference require instrumentation within the application itself; they do not provide a network-level proxy or agent that can extract telemetry just by routing traffic through it, the way HTTP proxies such as NGINX or Apache can inspect requests (see the linked discussion).

You can configure the Playground to use any OpenAI-compatible endpoint (such as your API gateway) by setting the `base_url` in the Playground UI or via the `OPENAI_BASE_URL` environment variable. However, this only routes LLM requests through your gateway; it does not enable Phoenix to capture traces unless your application is instrumented to emit OpenTelemetry data.

Would you like more detail on how to instrument your application for Phoenix, or on configuring custom endpoints in the Playground?

Sources:
- https://github.com/Arize-ai/phoenix/issues/8380
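
To illustrate the routing half of the answer: the `OPENAI_BASE_URL` environment variable redirects any OpenAI-compatible client at your gateway. This is a minimal sketch, assuming `https://xyz.com/v1` (from the question) is the gateway endpoint:

```python
import os

# Point OpenAI-compatible clients at the gateway instead of api.openai.com.
# Any OpenAI SDK client created without an explicit base_url picks this up,
# and the Playground reads the same variable when configured via environment.
# "https://xyz.com/v1" is the gateway endpoint from the question above.
os.environ["OPENAI_BASE_URL"] = "https://xyz.com/v1"

# Confirm the override is in place before constructing a client.
print(os.environ["OPENAI_BASE_URL"])
```

Note that this only redirects traffic. To actually land spans in Phoenix you would still instrument the application itself, for example with the OpenInference OpenAI instrumentor (`openinference-instrumentation-openai`) registered against a Phoenix collector.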
💡 Hint: Mention RunLLM in the thread for follow-ups.