Does the Phoenix LLM wrapper (class LLM(*, provider=None, model=None, client=None, initial_per_second_request_rate=None, **kwargs)) support sending file attachments to models? I need to implement a custom LLM-as-judge ClassificationEvaluator that passes a PDF attachment to the judge model.
For more context: I want to evaluate a task that uses Gemini to extract text from a PDF, so the LLM judge (also Gemini here) needs to see the PDF itself, not just the extracted text.
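For reference, this is roughly the request shape I'd need the wrapper to produce downstream. A minimal stdlib-only sketch of a Gemini generateContent payload with an inline PDF part (field names based on Google's REST examples for document understanding; the helper name is mine, not a Phoenix or Gemini API):

```python
import base64


def build_gemini_pdf_request(pdf_bytes: bytes, prompt: str) -> dict:
    """Sketch: assemble a generateContent-style payload that pairs an
    inline base64-encoded PDF with the judge prompt text."""
    return {
        "contents": [
            {
                "parts": [
                    # PDF goes in as inline binary data, base64-encoded
                    {
                        "inline_data": {
                            "mime_type": "application/pdf",
                            "data": base64.b64encode(pdf_bytes).decode("ascii"),
                        }
                    },
                    # Judge instructions / classification prompt
                    {"text": prompt},
                ]
            }
        ]
    }
```

My question is essentially whether the Phoenix LLM wrapper exposes any way to get a multimodal part like this through to the underlying client, or whether it only accepts text prompts.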