Hey, is it possible to use a self-hosted OpenAI-compatible LLM endpoint in the playground?
Can’t see any way to configure this. The only options listed are: OpenAI, Azure OpenAI, Anthropic, and Google AI Studio.
Yes, it is possible to use a self-hosted OpenAI-compatible LLM endpoint in the Phoenix playground. You can set the OPENAI_BASE_URL environment variable in your configuration, such as in a docker-compose.yml file, to point to your self-hosted endpoint. This allows the playground to interact with your custom LLM endpoint as if it were an OpenAI endpoint, provided it follows the OpenAI API specifications.

Would you like more detail on how to set this up?

Sources:
- GitHub Issue #5836
💡 Hint: Mention RunLLM in the thread for followups.
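For reference, the environment-variable setup the bot describes would look roughly like this in a docker-compose.yml. This is a sketch only: the image tag, port, and endpoint URL are illustrative, and it assumes the playground actually honors OPENAI_BASE_URL (per the replies below, that support was not yet implemented at the time).

```yaml
# Sketch only — assumes Phoenix's playground honors OPENAI_BASE_URL.
services:
  phoenix:
    image: arizephoenix/phoenix:latest   # illustrative image tag
    ports:
      - "6006:6006"                      # Phoenix UI
    environment:
      - OPENAI_API_KEY=sk-placeholder    # many self-hosted servers ignore the key
      - OPENAI_BASE_URL=http://llm-server:8000/v1   # your OpenAI-compatible endpoint
```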
Yeah, this is definitely on our radar but not yet implemented. Throw a thumbs up on the linked issue above and we can get it prioritized 🙏
Seems like all that’s needed is an extra config field to store OPENAI_BASE_URL in addition to OPENAI_API_KEY, so it could follow what’s already done for the API key.
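To illustrate why a single base-URL field is the crux: an OpenAI-compatible server keeps the same request paths and JSON payloads as the official API, so a client only needs to swap the host. A minimal stdlib sketch (the localhost URL is a placeholder, not a real default):

```python
import os
from urllib.parse import urljoin

# Placeholder fallback; any OpenAI-compatible server (vLLM, Ollama,
# LocalAI, etc.) exposes the same /chat/completions path under its base URL.
base_url = os.environ.get("OPENAI_BASE_URL", "http://localhost:8000/v1")

# Only the host/prefix changes; the path and payload stay identical
# to the official OpenAI API.
endpoint = urljoin(base_url.rstrip("/") + "/", "chat/completions")
print(endpoint)  # e.g. http://localhost:8000/v1/chat/completions
```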
I wouldn't call it trivial as it spans a few components of the backend and into the query layer of the frontend, but at the same time I would never discourage someone from hacking on the codebase 😅