Tracing Options for Local Models in Arize Phoenix Support
Hey Soubhik M., we can help you over in Phoenix Support; this channel is for users of app.arize.com. But to answer your question here: tracing works on top of frameworks and client SDKs, so if you are using one of those (LlamaIndex, LangChain, the OpenAI SDK, DSPy) you can easily trace even local models. Here's an example using DSPy with Ollama: https://github.com/diicellman/dspy-rag-fastapi If you are using ollama or llama.cpp directly, we don't have auto-instrumentors for those yet (tickets welcome!), but you can trace them with the manual tracing APIs if you so choose (https://docs.arize.com/phoenix/tracing/how-to-tracing/manual-instrumentation)
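If you do go the manual route, the general shape looks something like the sketch below. Note this is a stdlib-only illustration: the `llm_span` context manager and `SPANS` list are hypothetical stand-ins for a real OpenTelemetry tracer and span exporter (in actual Phoenix usage you'd open spans with the OpenTelemetry API and attach OpenInference attributes, per the manual-instrumentation docs linked above), and `generate` stubs out the real HTTP call to a local model server such as Ollama.

```python
# Hedged sketch of manually tracing a local-model call. The span recorder
# here is a stand-in for an OpenTelemetry tracer/exporter; the model call
# is stubbed rather than hitting a real local server.
import time
from contextlib import contextmanager

SPANS = []  # stand-in for a real span exporter backend

@contextmanager
def llm_span(name, **attributes):
    """Record a name, attributes, and wall-clock latency for one call."""
    span = {"name": name, "attributes": dict(attributes)}
    start = time.perf_counter()
    try:
        yield span
    finally:
        span["latency_s"] = time.perf_counter() - start
        SPANS.append(span)

def generate(prompt):
    # Placeholder for the real call, e.g. POSTing JSON to Ollama's
    # local HTTP endpoint (http://localhost:11434/api/generate).
    return f"echo: {prompt}"

# Wrap each model invocation in a span, recording inputs and outputs.
with llm_span("ollama.generate", model="llama2", prompt="hello") as span:
    span["attributes"]["output"] = generate("hello")

print(SPANS[0]["name"])  # the recorded span is now available for export
```

The idea is the same with the real APIs: open a span around the model call, set input/output attributes on it, and let the exporter ship it to Phoenix.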
