To instrument your application when using Hugging Face models with Phoenix, you can use the `SmolagentsInstrumentor` from the `openinference-instrumentation-smolagents` package. This allows you to trace and visualize the steps and calls made by your agents. Here's a basic setup:
1. *Install Required Packages*: Ensure you have the necessary packages installed.
```
pip install arize-phoenix-otel openinference-instrumentation-smolagents smolagents
```
2. *Set Up Environment Variables*: Add your Hugging Face token as an environment variable.
```
import os
os.environ["HF_TOKEN"] = "<your_hf_token_value>"
```
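A stray trailing space in a copied token is an easy way to get confusing authentication failures. A small helper like the one below (hypothetical, not part of smolagents or Phoenix) normalizes the variable and fails fast when it is missing:
```python
import os

def get_hf_token() -> str:
    """Read HF_TOKEN from the environment, stripping stray whitespace.

    Raises RuntimeError if the variable is unset so misconfiguration
    surfaces immediately instead of as an opaque API error later.
    """
    token = os.environ.get("HF_TOKEN", "").strip()
    if not token:
        raise RuntimeError("HF_TOKEN is not set; export it before running the agent.")
    return token
```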
3. *Initialize the Instrumentor*: Before your application code, register a tracer provider that points at your Phoenix instance and pass it to the `SmolagentsInstrumentor`.
```
from phoenix.otel import register
from openinference.instrumentation.smolagents import SmolagentsInstrumentor

tracer_provider = register(project_name="my-smolagents-app")
SmolagentsInstrumentor().instrument(tracer_provider=tracer_provider)
```
4. *Create and Run an Agent*: Use the `smolagents` library to create and run an agent backed by a Hugging Face model.
```
from smolagents import HfApiModel, ToolCallingAgent, ManagedAgent, DuckDuckGoSearchTool, VisitWebpageTool

model = HfApiModel()
agent = ToolCallingAgent(tools=[DuckDuckGoSearchTool(), VisitWebpageTool()], model=model)
managed_agent = ManagedAgent(agent=agent, name="managed_agent", description="This is an agent that can do web searches.")

# Each step and tool call of the run is traced and sent to Phoenix
agent.run("What is the current weather in Paris?")
```
For more detailed instructions, you can refer to the <https://docs.arize.com/phoenix/tracing/integrations-tracing/hfsmolagents|Phoenix documentation on Hugging Face smolagents>.
Would you like more detail on any specific part of this process?
Sources:
- <https://docs.arize.com/phoenix/tracing/integrations-tracing/hfsmolagents|Phoenix documentation on Hugging Face smolagents>