Troubleshooting FastAPI Tracing Issues with Phoenix in Production
Hi Phoenix Team, I've been using Phoenix locally for tracing in a FastAPI application, and it's been working great. Specifically, I'm using arize-phoenix~=4.35.0 with the following setup:
```python
import json
from typing import Any

from litellm import completion

from app.src.logging import logger
from app.src.llm_tools import TOOLS_DEF_LIST, RETRIEVE_FAQS, TOOLS_MAPPING
from app.src.generate_llm_response.llm_utils.config import LLM_MODEL

# Tracing the LLM calls
import phoenix as px
from phoenix.trace.openai import OpenAIInstrumentor

# Initialize OpenAI auto-instrumentation
OpenAIInstrumentor().instrument()
session = px.launch_app()


async def call_llm(
    messages: list[dict[str, str]], use_tools: bool = False
) -> dict[str, Any]:
    try:
        # LLM call code...
```

Now, we're planning to move this application into production, and to improve performance and scalability I wanted to switch to the hosted Phoenix solution. Following the documentation, I modified my code as follows:
```python
import os

from phoenix.otel import register
from openinference.instrumentation.litellm import LiteLLMInstrumentor
from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor

# Phoenix API Key
PHOENIX_API_KEY = "66dd...redacted..."
os.environ["PHOENIX_CLIENT_HEADERS"] = f"api_key={PHOENIX_API_KEY}"

# Configure the Phoenix tracer
tracer_provider = register(
    project_name="default",
    endpoint="https://app.phoenix.arize.com/v1/traces",
)

# Instrument LiteLLM
LiteLLMInstrumentor().instrument(tracer_provider=tracer_provider)

# Instrument FastAPI for context propagation
FastAPIInstrumentor.instrument_app(app)

...


async def call_llm(
    messages: list[dict[str, str]], use_tools: bool = False
) -> dict[str, Any]:
    try:
        # LLM call code...
```
I also tried placing the instrumentation code in `main.py`, where the FastAPI app is defined, but tracing still does not work.

*Issue:* Despite following the documentation and instrumenting both LiteLLM and FastAPI, no traces appear in the Phoenix dashboard when the FastAPI application runs. The application runs without errors and the LLM calls succeed, but no tracing data shows up in the dashboard. When I run similar code in a Jupyter notebook, tracing works correctly and I can see the traces in Phoenix. This leads me to believe the issue might be related to context propagation in the asynchronous FastAPI application.
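For reference, my understanding is that OpenTelemetry's async context propagation is built on Python's stdlib `contextvars`, which asyncio tasks inherit automatically when they are created. Here's a stdlib-only sketch of that mechanism (the names are illustrative, not Phoenix or OpenTelemetry internals):

```python
import asyncio
import contextvars

# OpenTelemetry stores the active span in a ContextVar; asyncio copies
# the current context when a task is created, so child tasks see the
# span that was active in the parent at creation time.
current_span = contextvars.ContextVar("current_span", default=None)


async def child():
    # Reads the value set by the parent, via the inherited context copy.
    return current_span.get()


async def parent():
    current_span.set("root-span")
    return await asyncio.create_task(child())


result = asyncio.run(parent())
print(result)  # → root-span
```

If spans reach Phoenix from a notebook but not from the async app, I'd expect a break somewhere in this inheritance chain, which is why I suspect context propagation.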
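One debugging step I'm considering, to narrow down whether spans are being created at all before they reach the network, is to swap in the in-memory exporter from `opentelemetry-sdk` (a sketch under that assumption; this bypasses the Phoenix endpoint entirely):

```python
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.sdk.trace.export.in_memory_span_exporter import (
    InMemorySpanExporter,
)

# Route spans to an in-memory exporter so they can be inspected
# directly, with no collector or network involved.
exporter = InMemorySpanExporter()
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(exporter))

tracer = provider.get_tracer("debug-tracing")
with tracer.start_as_current_span("test-span"):
    pass  # instrumented work would go here

span_names = [s.name for s in exporter.get_finished_spans()]
print(span_names)  # expect ['test-span'] if spans are being created
```

If spans show up here but not in the dashboard, the problem is on the export path (endpoint, headers, or flushing) rather than instrumentation.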
