Hi, is it possible to connect the Phoenix client over HTTPS to a domain? For example, can I connect to "https://phoenix-demo.arize.com"?
Yes, it is possible to connect the Phoenix client over HTTPS to a domain. You can specify the endpoint URL directly when initializing `px.Client`:

```python
import phoenix as px

px_client = px.Client(
    endpoint="https://phoenix-demo.arize.com",
)
```

Would you like more details on this setup?

Sources:
- Slack Discussion
💡 Hint: Mention RunLLM in the thread for followups.
RunLLM, I'm getting three issues when I connect with it:
1. `HTTPStatusError: Client error '403 Forbidden' for url`
2. `httpx.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007)`
3. Sometimes it says that Arize is not deployed on that endpoint, but I can see the UI.
To address these issues, follow these steps:

1. **403 Forbidden Error**:
   - Ensure that the endpoint URL is correct and accessible.
   - Verify that the necessary permissions are set up correctly on the server.
2. **SSL Certificate Verification Error**:
   - This error occurs due to SSL certificate issues. You can disable SSL verification by swapping in an `httpx` client with `verify=False`, but this is not recommended for production environments due to security risks.

   ```python
   import httpx
   import phoenix as px

   class CustomClient(px.Client):
       def __init__(self, *args, **kwargs):
           super().__init__(*args, **kwargs)
           # Replace the underlying HTTP client with one that
           # skips certificate verification (insecure!).
           self._client = httpx.Client(verify=False)

   px_client = CustomClient(
       endpoint="https://phoenix-demo.arize.com",
   )
   ```
3. **Arize Not Deployed Error**:
   - Ensure that the Phoenix server is running and accessible at the specified endpoint.
   - Use internal DNS or a direct IP address if you are within a VPC or internal network.

Would you like more details on any specific issue?

Sources:
- [Slack Discussion](https://arize-ai.slack.com/archives/C04R3GXC8HK/p1724440714.710219)
💡 Hint: Mention RunLLM in the thread for followups.
Hey Anuraag T., https://phoenix-demo.arize.com/model is a demo deployment of Phoenix. It doesn't accept client connections or writes.
Maybe you're looking for https://llamatrace.com/login?
RunLLM, can I get the payload for the /v1/traces API? Is there a Swagger document for it?
Phoenix does not provide a specific Swagger document for the `/v1/traces` API. However, you can refer to the OpenTelemetry documentation for the structure of trace payloads, as Phoenix uses OpenTelemetry for tracing.

For example, a typical trace payload in JSON format might look like this:

```json
{
  "resourceSpans": [
    {
      "resource": {
        "attributes": [
          { "key": "service.name", "value": { "stringValue": "your-service-name" } }
        ]
      },
      "instrumentationLibrarySpans": [
        {
          "spans": [
            {
              "traceId": "4bf92f3577b34da6a3ce929d0e0e4736",
              "spanId": "00f067aa0ba902b7",
              "name": "your-span-name",
              "startTimeUnixNano": 1589919260000000000,
              "endTimeUnixNano": 1589919261000000000,
              "attributes": [
                { "key": "http.method", "value": { "stringValue": "GET" } }
              ]
            }
          ]
        }
      ]
    }
  ]
}
```

Would you like more details on how to structure or send these payloads?

Sources:
- https://arize-ai.slack.com/archives/C04R3GXC8HK/p1721154534.358069
- https://docs.arize.com/phoenix/tracing/llm-traces-1
Note: I wasn't able to find highly relevant data sources, but above are a few potentially relevant links.
💡 Hint: Mention RunLLM in the thread for followups.
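For completeness, posting a payload like the one above yourself could look like this stdlib-only sketch. The host is hypothetical, the payload is a minimal empty OTLP body, and in practice the OpenTelemetry SDK builds and sends this for you:

```python
import json
import urllib.request

# Hypothetical endpoint; substitute your own Phoenix host.
ENDPOINT = "https://your-phoenix-host/v1/traces"

# Minimal (empty) OTLP/JSON trace payload.
payload = {"resourceSpans": []}

# Build the POST request; it is not sent here.
req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to actually send
```

As the reply below notes, constructing these payloads by hand is discouraged; the instrumentation libraries handle it.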
Hey Anuraag T., we don't recommend manually constructing the payload for that route. Our instrumentation will create the appropriate payload for you.
Sure, I won't touch that. Although, through some hassle and minor tweaking, I'm able to pass my SSL CA bundle using the certifi lib and it's working fine for me; I can now connect the Phoenix client to my server.
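The tweak Anuraag describes might look roughly like the sketch below: pointing Python's SSL machinery at a CA bundle via environment variables before creating the client. The bundle path is a placeholder; `certifi.where()` or an internal PEM file would supply the real one, and exactly which variables are honored depends on the HTTP stack in use:

```python
import os

# Placeholder path; in the setup described above it would come from
# certifi.where() or a corporate CA bundle in PEM format.
CA_BUNDLE = "/etc/ssl/certs/ca-certificates.crt"

# Honored by ssl.create_default_context() when building default contexts.
os.environ["SSL_CERT_FILE"] = CA_BUNDLE
# Honored by some HTTP stacks (e.g. requests).
os.environ["REQUESTS_CA_BUNDLE"] = CA_BUNDLE

# The Phoenix client created after this point picks up the bundle:
# import phoenix as px
# px_client = px.Client(endpoint="https://your-phoenix-host")
```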
Awesome!
