Thank you
I checked LiteLLM earlier, but that would send the requests to the actual LLM provider, right? I want to use self-hosted models as evaluators.
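For context, a self-hosted endpoint inside the project usually exposes the same OpenAI-compatible request shape a public provider would, so pointing the evaluator's base URL at it keeps traffic internal. A minimal sketch (the endpoint URL and model name below are placeholders, not from this thread):

```python
import json
from urllib import request

# Hypothetical self-hosted endpoint inside the GCP project (placeholder URL).
API_BASE = "https://my-endpoint.example.internal/v1"

def build_eval_request(prompt: str, model: str = "my-judge-model") -> request.Request:
    """Build an OpenAI-compatible chat-completion request aimed at a
    self-hosted endpoint, so no traffic leaves the project."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.0,  # deterministic scoring for evaluation
    }
    return request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_eval_request("Rate the relevance of this answer on a 1-5 scale: ...")
# req.full_url targets the self-hosted endpoint, not a public provider.
```

The same idea applies when routing through a proxy layer: as long as the base URL resolves to the in-project endpoint, the evaluator calls never reach an external provider.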
Hi Xander S.! I am not looking for playground code; I want to set up batch evaluation jobs using LLM evaluators that are hosted within my GCP project.
This will also help with running generations on the fly for certain models.
Yes, possible workarounds for custom endpoints would be helpful, since they would help us meet privacy- and security-related compliance requirements.
Hello, does Phoenix support choosing LLM-as-judge models from GCP Model Garden, or from models deployed on endpoints in a GCP project?