I want to use Phoenix to instrument usage of LLMs running on both (a) Ollama locally, and (b) on Amazon Bedrock. These models would be accessed from various sources, for example: using Open WebUI. Is there a way to wire in Phoenix to the LLM-side? I ask because I won't have access to all of the clients using these models.
Phoenix supports instrumenting LLMs running on both Ollama locally and Amazon Bedrock. For Ollama, you can use the LiteLLMModel to point at your local models. For Amazon Bedrock, you can use the BedrockInstrumentor to instrument calls made via the boto3 client.

For Ollama:
1. **Set up LiteLLMModel**:
```
import os
from phoenix.evals import LiteLLMModel

os.environ["OLLAMA_API_BASE"] = "http://localhost:11434"

model = LiteLLMModel(model="ollama/llama3")
```

For Amazon Bedrock:
1. **Set up Bedrock Instrumentation**:
```
from phoenix.otel import register

tracer_provider = register(project_name="my-llm-app")
```

2. **Instrument boto3 client**:
```
import boto3
from openinference.instrumentation.bedrock import BedrockInstrumentor

BedrockInstrumentor().instrument(tracer_provider=tracer_provider)

client = boto3.client("bedrock-runtime")
```

For both setups, you can stream traces to a locally running Phoenix server:
```
import phoenix as px

session = px.launch_app()
```

For more details, refer to the [Phoenix documentation](https://docs.arize.com/phoenix/api/evaluation-models) and the [Bedrock integration guide](https://docs.arize.com/phoenix/tracing/integrations-tracing/bedrock).

Would you like more detail on any specific part?

Sources:
- [Bedrock Integration Guide](https://docs.arize.com/phoenix/tracing/integrations-tracing/bedrock)
💡 Hint: Mention RunLLM in the thread for followups.
This seems to require modification on the client side, which I don't have access to.
Again, I don't have control over the clients making use of my LLM server.... so that won't work.
Hmmm.... maybe I can put this in-between... https://github.com/BerriAI/litellm
Let the client connect to this proxy, and let this proxy hook in to Phoenix on the way to/from the LLM. What do you think?
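That proxy approach avoids touching the clients entirely: they speak the OpenAI-compatible API to the LiteLLM proxy, and tracing is configured on the server side. A minimal sketch of such a proxy config, assuming LiteLLM's generic OpenTelemetry callback can export to Phoenix's OTLP endpoint (the callback name, model IDs, and env vars below are illustrative assumptions — check the current LiteLLM docs, not verified here):

```yaml
# proxy_config.yaml — LiteLLM proxy routing to Ollama and Bedrock
model_list:
  - model_name: llama3                  # name clients will request
    litellm_params:
      model: ollama/llama3              # forwarded to local Ollama
      api_base: http://localhost:11434
  - model_name: claude                  # illustrative Bedrock model ID
    litellm_params:
      model: bedrock/anthropic.claude-3-sonnet-20240229-v1:0

litellm_settings:
  callbacks: ["otel"]                   # assumption: LiteLLM's OpenTelemetry callback
```

Clients would then point their base URL at the proxy (e.g. started with `litellm --config proxy_config.yaml --port 4000`), while the proxy exports traces to Phoenix over OTLP, e.g. via environment variables such as `OTEL_EXPORTER=otlp_http` and `OTEL_ENDPOINT=http://localhost:6006/v1/traces` (names assumed from LiteLLM's observability docs).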
