Hi! I have set up a self-hosted Phoenix instance. Tracing works perfectly, but I'm struggling to set up the prompt playground. I need it to use custom models hosted behind LiteLLM. I did set OPENAI_BASE_URL and OPENAI_API_KEY, which seems to be picked up, because when I try to submit a prompt with a standard OpenAI model it errors with "selected model does not exists", which is correct. I tried to add a new model (e.g. claude-3-5-sonnet) with the OpenAI provider, but I can't get it to show up in the model dropdowns in the playground or prompt workshop. Should this work? And if yes, does anybody have a suggestion for how to debug it?
Yes, Phoenix's prompt playground supports custom models via OpenAI-compatible endpoints such as LiteLLM by setting OPENAI_BASE_URL and OPENAI_API_KEY. However, adding a custom model in Settings → Models only affects cost tracking and pricing; it does not make the model appear in the playground dropdown. The playground model dropdown is populated from a fixed list per provider and currently does not dynamically include custom entries added in the UI or via regex patterns.

You can still use your custom model by manually typing its name in the model field if the UI allows it, or by passing it as a string if your Phoenix version supports that (a recent enhancement). If your Phoenix UI does not allow free-form model name entry, this is a known limitation. As a workaround, ensure your LiteLLM endpoint is reachable and that the model name exactly matches what LiteLLM expects. For debugging, check the browser console for UI errors, and verify the network requests to your LiteLLM endpoint. For more details, see the Phoenix docs on configuring AI providers and the related GitHub issue.

Would you like more detail on how to enable free-form model entry or on debugging the LiteLLM integration?

Sources:
- Enhancement: Allow for passing in a string as a model name for providers with fixed model lists
- Phoenix docs on configuring AI providers
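To make the debugging steps above concrete, here is a minimal sketch of the setup and checks described. The env var names come from the thread; the host, port, API key, and model name are placeholder assumptions, so substitute your own values:

```shell
# Point Phoenix's OpenAI client at the LiteLLM proxy
# (URL and key below are example placeholders)
export OPENAI_BASE_URL="http://localhost:4000/v1"
export OPENAI_API_KEY="sk-litellm-example-key"

# Verify the proxy is reachable and list the model names it exposes;
# the name entered in the playground must match one of these exactly
curl -s "$OPENAI_BASE_URL/models" \
  -H "Authorization: Bearer $OPENAI_API_KEY"

# Sanity-check a chat completion against the custom model directly
curl -s "$OPENAI_BASE_URL/chat/completions" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "claude-3-5-sonnet",
       "messages": [{"role": "user", "content": "ping"}]}'
```

If both curl calls succeed but the playground request still fails, the problem is on the Phoenix side (e.g. the fixed model list) rather than in the LiteLLM configuration.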
💡 Hint: Mention RunLLM in the thread for followups.
Hey Philipp M., glad you are unblocked. We can definitely make the model listings dynamic. Curious what your setup is with Anthropic, given that you are using the OpenAI client? We also have Anthropic specifically as a provider (and also have it through Bedrock). Thanks for the feedback! Just trying to streamline the experience.
