Hello 👋 I need help with LLM tracing for a simple RAG app.
i followed all instructions to get started with LLM tracing here: https://docs.arize.com/arize/llm-tracing/quickstart-llm
it worked well. i was able to see that a project was automatically created on the phoenix console, and i could view the trace for the simple chat completion call to OpenAI.
now i built a rag interface and did whatever was necessary to trace it. however, i'm not able to view traces on the phoenix console. code is here (i've just removed the .env file that contains the OpenAI key).
can you please let me know what i'm doing wrong? Thank you!
Arize SaaS, sorry for the miscommunication
No worries, thanks for clarifying!
Are you trying to use a library or particular package?
here's everything i imported
import os
import openai
from dotenv import load_dotenv
from flask import Flask, request, jsonify, render_template
from sentence_transformers import SentenceTransformer, util
from bs4 import BeautifulSoup
import requests
from arize.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor
Do you mind sending a full code snippet to reproduce?
it still didn't work. here's what i changed
# Import necessary dependencies
import os
import openai
from dotenv import load_dotenv
from flask import Flask, request, jsonify, render_template
from sentence_transformers import SentenceTransformer, util
from bs4 import BeautifulSoup
import requests
from openinference.instrumentation.openai import OpenAIInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor
load_dotenv()
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
openai.api_key = OPENAI_API_KEY
from arize.otel import register
# OpenTelemetry setup
tracer_provider = register(
    space_id="X==",  # Replace with actual Space ID
    api_key="X",  # Replace with actual API Key
    project_name="Arize_Nitin",  # Replace with your project name
)
endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
# Initialize Flask app
app = Flask(__name__)
Do you see traces being logged to stdout?
the application is working fine
ConsoleSpanExporter will print to stdout
You should see some printed traces in your logs each time a span is created.
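One thing that stands out in the snippet: the second `trace_sdk.TracerProvider()` replaces the provider that `register(...)` returned, and its `OTLPSpanExporter` points at a local Phoenix endpoint (`http://127.0.0.1:6006`) rather than Arize SaaS. A minimal sketch of the setup, assuming you only want to export to Arize SaaS (the env var names here are placeholders, not required names):

```python
import os

from arize.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

# register() already builds and configures a TracerProvider that exports
# to Arize SaaS, so there's no need to create a second provider or point
# an exporter at a local Phoenix endpoint.
tracer_provider = register(
    space_id=os.getenv("ARIZE_SPACE_ID"),  # placeholder: your Space ID
    api_key=os.getenv("ARIZE_API_KEY"),    # placeholder: your API Key
    project_name="Arize_Nitin",
)

# Instrument OpenAI calls with the provider returned by register(),
# not a freshly constructed one.
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```

If you also want spans echoed locally for debugging, add a console processor to this same provider instead of building a new one.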
