Hi, we are building an AI planning extension for VS Code in TypeScript. We wanted to use Phoenix tracing in decorator format, and we have tried almost everything available in the docs but are not able to send traces to the UI. Does Phoenix officially not provide decorator-based tracing in TS? Does it also not provide auto-instrumentation in TS? Best, Ayaan
Phoenix does not provide decorator-based tracing in TypeScript; decorators for tracing are only available in Python. In TypeScript, Phoenix supports tracing via manual or automatic instrumentation using the OpenTelemetry SDK and OpenInference instrumentors, but not via a decorator or annotation syntax as in Python. You must set up a TracerProvider and register instrumentations for supported libraries to emit traces to Phoenix; see the official TypeScript tracing quickstart and instrumentation setup.

Auto-instrumentation is supported in TS/JS for certain libraries (e.g., OpenAI, LangChain.js) via OpenInference instrumentors, but you must explicitly register these instrumentors in your code; there is no one-line decorator or annotation approach. For example, you can use OpenAIInstrumentation with OpenTelemetry, but you cannot use a decorator to trace arbitrary functions or methods in TypeScript as you can in Python. See this example for how to set up auto-instrumentation in TS.

Would you like a step-by-step example of how to set up tracing in your TypeScript VS Code extension?

Sources:
- https://arize.com/docs/phoenix/tracing/llm-traces-1/quickstart-tracing-ts#overview
- https://github.com/Arize-ai/phoenix/blob/main/docs/tracing/how-to-tracing/setup-tracing/javascript.md
- https://github.com/Arize-ai/phoenix/issues/9236
We tried this already; it does not work.
hey Ayaan K., thanks for the feedback. Excited you are using JS instrumentation with Phoenix. Decorators for TS make sense as long as they're opt-in. I believe it requires a flag in tsconfig? In any case, I filed an issue for the team. Please upvote or add feedback if you feel like it! https://github.com/Arize-ai/openinference/issues/2228
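For context on the tsconfig flag mentioned here: TypeScript 5.0+ supports standard ECMAScript decorators with no flag, while the older "experimental" decorator syntax (which many libraries still target) is opt-in via a compiler option. A minimal sketch of that opt-in:

```json
{
  "compilerOptions": {
    "target": "ES2022",
    // Opt-in for legacy (pre-TS 5.0) decorators.
    "experimentalDecorators": true,
    // Often paired with the above when runtime type metadata is needed.
    "emitDecoratorMetadata": true
  }
}
```

Which of the two decorator flavors a future Phoenix TS decorator would target is an open design question for the linked issue.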
We also need support for decorators in TypeScript. We have similarly been using LangSmith, whose traceable method allows tagging different run types like chain, tool, LLM, etc., and also constructs the call tree identically to how it's shown on your website (probably using Python). We have been facing issues with LangSmith at our current production scale and are looking for alternative providers; we feel that Phoenix is the closest option. By the way, LangSmith also offers an easy-to-use "take to playground" option for LLM calls, where the playground pre-fills the entire conversation along with other settings, such as model and tools. Does Phoenix have similar support?
Yup! We have span replay https://arize.com/docs/phoenix/prompt-engineering/overview-prompts/span-replay
Why is there no native support for some LLMs in TS?
Just time and resources. Definitely file us an issue on GitHub!
