Azure OpenAI Custom Endpoint Configuration Issue in Docker Setup
Hi, I'm hitting an issue configuring the LLM provider for a custom Azure OpenAI endpoint. I'm running Phoenix in Docker using my own Dockerfile, which exposes ports 6006 and 4317 and runs pip install arize-phoenix openai. My Docker Compose file maps ports 6006 and 4317, starts the server with the command python3 -m phoenix.server.main serve, and sets the three environment variables AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY, and OPENAI_API_VERSION.

In the UI, I configured the Azure OpenAI model as follows: deployment name: gpt-4o-global-v2024-11-20, endpoint: https://<>/api/azure, api_version: 2024-10-21. When I try running a simple prompt in the playground, it returns a Connection Error, and I don't see any error logs in the container either. Sorry, I can't provide screenshots since I'm behind a company proxy, which is also the reason I couldn't run the Arize Phoenix Docker image directly.

I suspect some networking issue. I have also tried running the following curl command, and the response is correct:

curl -X POST -H "x-api-key: $AZURE_OPENAI_API_KEY" -d '{"messages": []}' <endpoint_base_url>/openai/deployments/<model_id>/chat/completions?api-version=2024-10-21
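For reference, here is a minimal sketch of the compose service matching the setup described above. The service name and the decision to build from the local Dockerfile are assumptions on my side; the ports, command, and environment variables are exactly the ones listed in the post:

```yaml
# Sketch only: service name and build context are illustrative,
# everything else mirrors the setup described above.
services:
  phoenix:
    build: .                     # custom Dockerfile (pip install arize-phoenix openai)
    command: python3 -m phoenix.server.main serve
    ports:
      - "6006:6006"              # Phoenix UI
      - "4317:4317"              # OTLP gRPC collector
    environment:
      - AZURE_OPENAI_ENDPOINT=${AZURE_OPENAI_ENDPOINT}
      - AZURE_OPENAI_API_KEY=${AZURE_OPENAI_API_KEY}
      - OPENAI_API_VERSION=${OPENAI_API_VERSION}
```

Note that the working curl above was run from the host; since a corporate proxy is involved, the container may need the proxy settings (e.g. HTTP_PROXY/HTTPS_PROXY) passed through as well for the same request to succeed from inside it.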
