Hey everyone! It's me again, but this time I wanted to ask about evals. Do I need to connect to Vertex AI to use Google models in the LLM class? I tried the following and got an "unrecognized provider" error:
```python
llm = LLM(model="gemini-2.5-flash", provider="google")
```

No, you do not need to connect to Vertex AI to use Google models in the `LLM` class, but you must use the correct provider name. For Gemini models the provider should be `"google-genai"`, not `"google"`; passing `"google"` raises an "unrecognized provider" error, because the Phoenix LLM abstraction only supports specific provider names. Set the provider to `"google-genai"` and make sure the required dependency is installed (e.g. `google-genai>=1.0.0`). Example:

```python
from phoenix.evals.llm import LLM

llm = LLM(provider="google-genai", model="gemini-2.5-flash")
```

See Configuring the LLM and Google Gen AI Evals for details. Would you like more detail on authentication or usage?

Sources:
- Google Gen AI Evals
- Configuring the LLM
💡 Hint: Mention RunLLM in the thread for followups.
AttributeError: 'GoogleGenAIModel' object has no attribute 'generate_classification'
Dustin N., Elizabeth H., do you know the answer?
hi, try using litellm as the provider and a fully qualified provider/model name as the model parameter
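A minimal sketch of what that suggestion looks like, assuming arize-phoenix-evals and litellm are installed; the `qualified_model` helper is my own illustration of litellm's "provider/model" naming convention, not a Phoenix API:

```python
def qualified_model(provider_prefix: str, model: str) -> str:
    """Build a litellm-style fully qualified model name,
    e.g. 'gemini/gemini-2.5-flash' for the Google AI Studio route."""
    return f"{provider_prefix}/{model}"

# Usage with Phoenix's LLM wrapper (commented out here because it needs
# the packages installed plus valid credentials to actually call the model):
# from phoenix.evals.llm import LLM
# llm = LLM(provider="litellm", model=qualified_model("gemini", "gemini-2.5-flash"))
```

The point of the fully qualified name is that litellm dispatches on the prefix before the slash, so the same `provider="litellm"` setting can route to different backends just by changing the model string.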
