Hello! I am trying to run Phoenix instrumentation for an Azure Function that calls an AzureOpenAI LLM accessed via the openai library. However, when I run the Azure Function locally via Visual Studio Code (i.e. func host start), the Phoenix server does not receive any traces. Interestingly, when I run the same code outside of the Azure Function (i.e. outside the http_trigger that has the @app.route decorator), traces are sent to the Phoenix server. I have stripped the code down to only the following, but the behavior is the same. Is this expected? I have a feeling the Azure Function wrapper is blocking the telemetry requests, but I couldn't find anything online. Any help appreciated! I also noticed that sampling_result in trace/__init__.py is False when executed inside the Azure Function, but True outside. Not sure why.
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from openinference.instrumentation.openai import OpenAIInstrumentor
import azure.functions as func
app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)
@app.route(route="http_trigger")
def http_trigger(req: func.HttpRequest):
    endpoint = "http://localhost:6006/v1/traces"
    tracer_provider = trace_sdk.TracerProvider()
    tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
    trace_api.set_tracer_provider(tracer_provider)
    trace_api.get_tracer(__name__).start_span("test12321985").end()
    return func.HttpResponse(
        'hi',
        status_code=200,
    )
What if you replace this:

trace_api.set_tracer_provider(tracer_provider)
trace_api.get_tracer(__name__).start_span("test12321985").end()

with this:

tracer_provider.get_tracer(__name__).start_span("test12321985").end()

This is exactly what my teammate was seeing: only the Azure Functions app won't send traces for him locally. (Production Azure Functions works, it works locally for the rest of the team, and FastAPI servers also work for us.) I reported it here but haven't been able to debug it further: https://arize-ai.slack.com/archives/C04R3GXC8HK/p1730229020117559
cursed by phoenix gods 😅
which pypi package do you install for this?
i meant for azure
For this Azure Function, Python 3.9 with:

arize-phoenix==5.5.2
arize-phoenix-evals==0.16.1
arize-phoenix-otel==0.5.1
openinference-instrumentation==0.1.18
openinference-instrumentation-langchain==0.1.28
openinference-instrumentation-openai==0.1.14
openinference-semantic-conventions==0.1.10
opentelemetry-api==1.27.0
opentelemetry-exporter-otlp==1.27.0
opentelemetry-exporter-otlp-proto-common==1.27.0
opentelemetry-exporter-otlp-proto-grpc==1.27.0
opentelemetry-exporter-otlp-proto-http==1.27.0
opentelemetry-instrumentation==0.48b0
opentelemetry-instrumentation-httpx==0.48b0
opentelemetry-proto==1.27.0
opentelemetry-sdk==1.27.0
opentelemetry-semantic-conventions==0.48b0
opentelemetry-util-http==0.48b0
and for Azure, azure-functions==1.20.0
ok got it
azure-core==1.31.0 FYI
hm, yea it's not clear to me why it doesn't work, and i don't have azure access to replicate it
have you looked at these steps here?
