Hi, I'm using the code snippet below to instrument my application. I have 2 questions about it:
How can I instrument only the OpenAI calls I need, not all of them?
Is there any way to push evaluations using the same snippet?
code snippet:
import openai
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from openinference.instrumentation.openai import OpenAIInstrumentor
endpoint = "https://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint, certificate_file=r"path/to/cacert.pem")))
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
if __name__ == "__main__":
    client = openai.OpenAI()
    response = client.chat.completions.create(
        model="gpt-4-32k",
        messages=[{"role": "user", "content": "Write a haiku."}],
        max_tokens=20,
    )
    print(response.choices[0].message.content)
1. Instrument Only Required OpenAI Calls:
Use suppress_tracing to selectively disable tracing for specific OpenAI API calls: wrap the calls you don't want to trace in the suppress_tracing context manager.
2. Push Evaluation Using the Same Snippet:
To push evaluations, you can add custom attributes to the spans or create new spans for evaluation metrics.
Would you like more detail on either of these points?
Sources:
- Slack Conversation
- GitHub Issue #2340
💡 Hint: Mention RunLLM in the thread for followups.
RunLLM how do I trace my custom retriever span using the same api?
To trace your custom retriever span using the same API, you can manually create and manage spans within your retriever's logic. Here's how:
1. Setup Tracer: Ensure you have the tracer set up as shown in your initial code snippet.
2. Create Custom Span: Use the start_as_current_span method to create a span for your custom retriever.
Here's an example:
```
import openai
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from openinference.instrumentation.openai import OpenAIInstrumentor

endpoint = "https://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(
    SimpleSpanProcessor(OTLPSpanExporter(endpoint, certificate_file=r"path/to/cacert.pem"))
)
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

tracer = trace.get_tracer(__name__)

def custom_retriever(query):
    # Open a span covering the retrieval and attach the query/result to it.
    with tracer.start_as_current_span("custom_retriever_span") as span:
        span.set_attribute("query", query)
        # Your custom retriever logic here
        result = "retrieved data"  # Replace with actual retrieval logic
        span.set_attribute("result", result)
        return result

if __name__ == "__main__":
    client = openai.OpenAI()
    response = client.chat.completions.create(
        model="gpt-4-32k",
        messages=[{"role": "user", "content": "Write a haiku."}],
        max_tokens=20,
    )
    print(response.choices[0].message.content)

    # Call custom retriever
    query = "example query"
    retrieved_data = custom_retriever(query)
    print(retrieved_data)
```
This code sets up a custom span for the retriever and adds attributes to it.
Would you like more detail on any specific part of this process?
Sources:
hi Anuraag T.! It sounds like you want to use manual instrumentation with possibly sending an annotation/feedback within your application. We have a couple examples that show you how to do this:
here's a full stack example https://github.com/Arize-ai/phoenix/tree/main/examples/manually-instrumented-chatbot
and here's an example within a notebook https://github.com/Arize-ai/phoenix/blob/main/tutorials/human_feedback/chatbot_with_human_feedback.ipynb
In both of these examples, we go over how to wire up your application to send feedback (which can be an evaluation as well) to Phoenix directly, alongside a manually instrumented span.
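For the evaluation half of the question, a hedged sketch of pushing evaluations to Phoenix programmatically, assuming the arize-phoenix client's SpanEvaluations/log_evaluations path (the `build_eval_df` helper is hypothetical; span IDs come from your exported traces):

```python
import pandas as pd

def build_eval_df(span_id, score, label, explanation):
    # Phoenix expects one row per evaluated span, indexed by context.span_id.
    return pd.DataFrame(
        {"score": [score], "label": [label], "explanation": [explanation]},
        index=pd.Index([span_id], name="context.span_id"),
    )

# Sending requires a running Phoenix server (assumed API, verify version):
# import phoenix as px
# from phoenix.trace import SpanEvaluations
# px.Client().log_evaluations(
#     SpanEvaluations(
#         eval_name="relevance",
#         dataframe=build_eval_df("<span_id>", 1, "relevant", "query matched"),
#     )
# )
```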
Hi Dustin N., is there any documentation for the v1 APIs?
yeah! if you go to <server_endpoint>/docs you should get the OpenAPI spec for our REST API
is it possible to pass an SSL certificate file to phoenix.Client?
