Hi there, is there any tutorial or documentation available for using Arize Phoenix tracing, evaluation, inference, and retrieval with locally hosted LLMs such as Llama 3 or Mistral running on Ollama or llama.cpp?
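For context, here is a minimal sketch of the kind of setup I'm hoping to find documented. This assumes Phoenix's OpenInference OpenAI instrumentation and Ollama's OpenAI-compatible endpoint at `http://localhost:11434/v1`; the project name and model tag below are just my placeholders, and I haven't confirmed this is the recommended approach:

```python
from openai import OpenAI
from phoenix.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

# Register a tracer pointed at a locally running Phoenix instance
# (project name is a placeholder of mine)
tracer_provider = register(project_name="local-llm-demo")
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

# Ollama exposes an OpenAI-compatible API; the api_key value is unused
# but the client requires one
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3",  # whatever model tag is pulled locally in Ollama
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

If something like this (or an equivalent for llama.cpp's server) is covered in an official tutorial, a pointer would be much appreciated.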
Thanks!
Guide to Arize Phoenix with Local LLMs: Tutorials & Docs Needed | Arize AI Community