Hi, we've been running some experiments with OpenAI's LLMs and have seen great results. So far, we've followed the recommended examples for instrumentation using:

```
from phoenix.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

tracer_provider = register()
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```

Now we're looking to experiment with other LLMs, specifically models from Hugging Face. Could you advise on the recommended way to instrument our application when using Hugging Face models?
To instrument your application when using Hugging Face models with Phoenix, you can use the `SmolagentsInstrumentor` from the `openinference-instrumentation-smolagents` package. This allows you to trace and visualize the steps and calls made by your agents. Here's a basic setup:

1. **Install Required Packages**: Ensure you have the necessary packages installed.

```
pip install openinference-instrumentation-smolagents smolagents
```

2. **Set Up Environment Variables**: Add your Hugging Face token as an environment variable.
```
import os

os.environ["HF_TOKEN"] = "<your_hf_token_value>"
```

3. **Initialize the Instrumentor**: Before your application code, initialize the `SmolagentsInstrumentor`.
```
from phoenix.otel import register
from openinference.instrumentation.smolagents import SmolagentsInstrumentor

tracer_provider = register()
SmolagentsInstrumentor().instrument(tracer_provider=tracer_provider)
```

4. **Create and Run an Agent**: Use the `smolagents` library to create and run your Hugging Face model.
```
from smolagents import HfApiModel, ToolCallingAgent, ManagedAgent, DuckDuckGoSearchTool, VisitWebpageTool

model = HfApiModel()
agent = ToolCallingAgent(tools=[DuckDuckGoSearchTool(), VisitWebpageTool()], model=model)
managed_agent = ManagedAgent(agent=agent, name="managed_agent", description="This is an agent that can do web search.")
managed_agent.run("Your query here")
```

For more detailed instructions, you can refer to the [Phoenix documentation on Hugging Face smolagents](https://docs.arize.com/phoenix/tracing/integrations-tracing/hfsmolagents).

Would you like more detail on any specific part of this process?

Sources:
- [Phoenix documentation on Hugging Face smolagents](https://docs.arize.com/phoenix/tracing/integrations-tracing/hfsmolagents)
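Putting the four steps above together, here is a minimal end-to-end sketch. It assumes `phoenix.otel.register` for the tracer provider (as in the OpenAI setup earlier in the thread) and a running Phoenix instance; the query string is illustrative. The third-party imports live inside `main()` only so the file can be loaded even when those packages are not installed:

```python
import os


def main():
    # Third-party imports are local so this sketch can be loaded without
    # phoenix, openinference, or smolagents installed.
    from phoenix.otel import register
    from openinference.instrumentation.smolagents import SmolagentsInstrumentor
    from smolagents import (
        DuckDuckGoSearchTool,
        HfApiModel,
        ManagedAgent,
        ToolCallingAgent,
        VisitWebpageTool,
    )

    # Step 2: the Hugging Face token must be set before the model is used.
    assert os.environ.get("HF_TOKEN"), "export HF_TOKEN first"

    # Step 3: register a tracer provider and instrument smolagents
    # before any agent code runs.
    tracer_provider = register()
    SmolagentsInstrumentor().instrument(tracer_provider=tracer_provider)

    # Step 4: build and run the agent; its steps show up as traces in Phoenix.
    model = HfApiModel()
    agent = ToolCallingAgent(
        tools=[DuckDuckGoSearchTool(), VisitWebpageTool()], model=model
    )
    managed_agent = ManagedAgent(
        agent=agent,
        name="managed_agent",
        description="This is an agent that can do web search.",
    )
    managed_agent.run("Your query here")


if __name__ == "__main__":
    main()
```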
💡 Hint: Mention RunLLM in the thread for followups.
To instrument your application with Hugging Face models using Phoenix, you can use the `SmolagentsInstrumentor` from the OpenInference library. This allows you to trace and visualize the interactions with Hugging Face models. Here's a concise guide:

1. **Install Required Packages**: Ensure you have the necessary packages installed.

```
pip install openinference-instrumentation-smolagents smolagents
```

2. **Set Up Environment Variables**: Add your Hugging Face token as an environment variable.
```
import os

os.environ["HF_TOKEN"] = "<your_hf_token_value>"
```

3. **Initialize the Instrumentor**: Use the `SmolagentsInstrumentor` to instrument your application.
```
from phoenix.otel import register
from openinference.instrumentation.smolagents import SmolagentsInstrumentor

tracer_provider = register()
SmolagentsInstrumentor().instrument(tracer_provider=tracer_provider)
```

4. **Create and Run Agents**: Define your Hugging Face model and agents, and run them to send traces to Phoenix.
```
from smolagents import HfApiModel, ToolCallingAgent, ManagedAgent

model = HfApiModel()
agent = ToolCallingAgent(tools=[], model=model)  # pass any tools the agent should use
managed_agent = ManagedAgent(agent=agent, name="managed_agent")
managed_agent.run("Your query here")
```

Would you like more detail on any specific part of this process?

Sources:
- [GitHub - Smolagents Tracing Documentation](https://github.com/Arize-ai/phoenix/blob/main/docs/tracing/integrations-tracing/hfsmolagents.md)
- [Phoenix Tracing with Smolagents](https://docs.arize.com/phoenix/tracing/integrations-tracing/hfsmolagents)
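One practical note on step 2: if `HF_TOKEN` is unset or blank, `HfApiModel` calls typically fail with an opaque authentication error deep inside the agent run. A small guard like the following (the `require_hf_token` helper is illustrative, not part of smolagents or Phoenix) fails fast with a clearer message:

```python
import os


def require_hf_token(var: str = "HF_TOKEN") -> str:
    """Illustrative helper: return the Hugging Face token from the
    environment, raising a clear error up front instead of an
    authentication failure mid-run."""
    token = os.environ.get(var, "").strip()
    if not token:
        raise RuntimeError(f"{var} is not set; export it before creating HfApiModel.")
    return token
```

Call `require_hf_token()` once, just before constructing `HfApiModel`.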
