New LLM decorators in OpenInference
Tag a function with @tracer.llm to automatically capture it as an OpenTelemetry span.
- Automatically parses input and output messages.
- Supports sync, async coroutine, sync generator, and async generator functions.
- Comes in decorator or context manager flavors, or patch your LLM client method directly.
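To make the decorator/context-manager pattern concrete, here is a minimal, stdlib-only sketch of how a tracing decorator can wrap both sync and async functions in a span. This is illustrative only, not the actual OpenInference implementation: `llm_span`, `SPANS`, and `invoke_llm` are hypothetical names standing in for a real OpenTelemetry tracer and LLM client.

```python
import asyncio
import functools
import time
from contextlib import contextmanager

# Hypothetical in-memory span store standing in for an OpenTelemetry exporter.
SPANS = []

@contextmanager
def llm_span(name):
    """Context-manager flavor: open a span, record timing, always close it."""
    start = time.monotonic()
    try:
        yield
    finally:
        SPANS.append({"name": name, "duration": time.monotonic() - start})

def llm(func):
    """Decorator flavor: wrap a sync or async function in an llm_span."""
    if asyncio.iscoroutinefunction(func):
        @functools.wraps(func)
        async def async_wrapper(*args, **kwargs):
            with llm_span(func.__name__):
                return await func(*args, **kwargs)
        return async_wrapper

    @functools.wraps(func)
    def sync_wrapper(*args, **kwargs):
        with llm_span(func.__name__):
            return func(*args, **kwargs)
    return sync_wrapper

@llm
def invoke_llm(prompt):
    # Stand-in for a real LLM client call.
    return f"echo: {prompt}"

print(invoke_llm("hi"))   # echo: hi
print(SPANS[0]["name"])   # invoke_llm
```

The real decorator additionally parses the wrapped call's arguments and return value into structured input/output messages on the span, and handles generator functions as well.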
More examples here
This is the next iteration in a series of improvements the Phoenix team has been making to power up OpenInference. There's been a wave of new instrumentors (including MCP, with more to come soon!), along with a whole set of tools to make manual instrumentation easier. As always, let us know if you have any feedback or new wishlist items!