I've been running into this issue recently. I was able to push traces with the same code a while ago, but now I'm getting the errors below. Any ideas?

Transient error StatusCode.DEADLINE_EXCEEDED encountered while exporting traces to otlp.arize.com, retrying in 1s.
Transient error StatusCode.UNAVAILABLE encountered while exporting traces to otlp.arize.com, retrying in 2s.
Transient error StatusCode.UNAVAILABLE encountered while exporting traces to otlp.arize.com, retrying in 4s.
Transient error StatusCode.UNAVAILABLE encountered while exporting traces to otlp.arize.com, retrying in 8s.
Transient error StatusCode.UNAVAILABLE encountered while exporting traces to otlp.arize.com, retrying in 16s.
Transient error StatusCode.UNAVAILABLE encountered while exporting traces to otlp.arize.com, retrying in 32s.
Hi Usman, sorry you're experiencing problems. I did notice some internal issues occurring a bit before your message that may be related. Are you able to run your code again and see if you're still seeing those errors?
Thanks David. I was able to run the code subsequently and there were no issues.
Great to hear! Don't hesitate to let us know if anything else comes up.
Hey David. Ran into the same issue again today:

Transient error StatusCode.DEADLINE_EXCEEDED encountered while exporting traces to otlp.arize.com, retrying in 1s.
Transient error StatusCode.UNAVAILABLE encountered while exporting traces to otlp.arize.com, retrying in 2s.
Transient error StatusCode.UNAVAILABLE encountered while exporting traces to otlp.arize.com, retrying in 4s.
Transient error StatusCode.UNAVAILABLE encountered while exporting traces to otlp.arize.com, retrying in 8s.
Transient error StatusCode.UNAVAILABLE encountered while exporting traces to otlp.arize.com, retrying in 16s.
Transient error StatusCode.UNAVAILABLE encountered while exporting traces to otlp.arize.com, retrying in 32s.
Is it something I should be concerned about?
Hi Usman, sorry this is still happening. Are you able to share the code you're running?
I will dig in on my end; curious whether you're able to unblock by running it again, as it does seem like temporary unavailability. Though the fact that you've hit it again makes me suspect something more is going on.
I'm still unable to get it working atm. Sharing code in a moment.
def setup_arize_client() -> trace_api.Tracer:
    # Set the Space and API keys as headers for authentication
    headers = f"space_key={ARIZE_SPACE_KEY},api_key={ARIZE_API_KEY}"
    os.environ['OTEL_EXPORTER_OTLP_TRACES_HEADERS'] = headers
    # Set resource attributes for the name and version of your application
    trace_attributes = {
        "model_id": TRACING_PROJECT_NAME,  # This is how your model will show up in Arize
        "model_version": MODEL_VERSION,  # You can filter your spans by model version in Arize
    }
    span_exporter = OTLPSpanExporter(endpoint=ARIZE_ENDPOINT)
    tracer_provider = trace_sdk.TracerProvider(
        resource=Resource(attributes=trace_attributes)
    )
    tracer_provider.add_span_processor(
        SimpleSpanProcessor(span_exporter=span_exporter)
    )
    trace_api.set_tracer_provider(tracer_provider=tracer_provider)
    tracer = trace_api.get_tracer(__name__)
    OpenAIInstrumentor().instrument()
    return tracer
async def execute_in_span(span_name: str, execute_operation: callable, operation_params: dict,
                          tracer: trace_api.Tracer, additional_attributes: dict = None,
                          send_span: bool = False) -> Any:
    with tracer.start_as_current_span(span_name) as span:
        # Filter out sensitive keys from operation parameters before logging them as span attributes
        operation_params_filtered = {k: v for k, v in operation_params.items() if k not in ['tracer']}
        span_attributes = {
            SpanAttributes.INPUT_VALUE: {
                'input': {**operation_params_filtered}
            }
        }
        # Merge additional attributes if provided
        if additional_attributes:
            span_attributes.update(additional_attributes)
        # Inject the current span into operation parameters if requested
        if send_span:
            operation_params['span'] = span
        set_span_attributes(span, span_attributes)
        add_span_event(span, f"Starting the {span_name.lower()} process.")
        start_time = perf_counter()
        result = await execute_operation(**operation_params)
        execution_time = perf_counter() - start_time
        add_span_event(span, f"{span_name} process completed in {execution_time:.2f} seconds.")
        # Capture the operation's result as the span's output value
        output_attributes = {
            SpanAttributes.OUTPUT_VALUE: {'output': result}
        }
        set_span_attributes(span, output_attributes)
        return result
def add_span_event(span: Span, message: str) -> None:
    span.add_event(message)

def set_span_attributes(span: Span, attributes: Dict[Any, Any]) -> None:
    for attr, value in attributes.items():
        # Automatically assign MIME type for input and output values if not explicitly provided
        if attr.endswith(".value") and (
                SpanAttributes.INPUT_MIME_TYPE not in attributes
                and SpanAttributes.OUTPUT_MIME_TYPE not in attributes):
            mime_type_key = SpanAttributes.INPUT_MIME_TYPE if 'input' in attr else SpanAttributes.OUTPUT_MIME_TYPE
            span.set_attribute(mime_type_key, OpenInferenceMimeTypeValues.JSON.value)
        span.set_attribute(attr, json.dumps(value) if isinstance(value, dict) else value)
    # Set the span kind to 'chain' for all spans processed by this function
    span.set_attribute(SpanAttributes.OPENINFERENCE_SPAN_KIND, OpenInferenceSpanKindValues.CHAIN.value)
import json
import os
from time import perf_counter
from typing import Any, Dict

from openinference.instrumentation.openai import OpenAIInstrumentor
from openinference.semconv.trace import OpenInferenceMimeTypeValues, OpenInferenceSpanKindValues, SpanAttributes
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.trace import Span

ARIZE_SPACE_KEY = os.environ["ARIZE_SPACE_KEY"]
ARIZE_API_KEY = os.environ["ARIZE_API_KEY"]
TRACING_PROJECT_NAME = "proj-name"
MODEL_VERSION = "1.0"
ARIZE_ENDPOINT = "https://otlp.arize.com/v1"
execute_in_span is called on any function that needs tracing.
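To illustrate that calling pattern, here is a minimal, self-contained sketch. The NoOpTracer/NoOpSpan classes and the summarize operation are stand-ins added so the example runs without the OpenTelemetry SDK installed; in the real code the tracer comes from setup_arize_client() and spans are exported to Arize.

```python
import asyncio
from contextlib import contextmanager

# No-op stand-ins for the OpenTelemetry tracer/span (assumptions for this
# sketch only; the real objects come from setup_arize_client()).
class NoOpSpan:
    def set_attribute(self, key, value):
        pass

    def add_event(self, message):
        pass

class NoOpTracer:
    @contextmanager
    def start_as_current_span(self, name):
        yield NoOpSpan()

# Hypothetical async operation to trace (the name `summarize` is invented).
async def summarize(text: str) -> str:
    return text.upper()

async def run_traced(tracer) -> str:
    # Mirrors the shape of execute_in_span(span_name, execute_operation,
    # operation_params, tracer): open a span, record events, await the op.
    with tracer.start_as_current_span("Summarize") as span:
        span.add_event("Starting the summarize process.")
        result = await summarize(text="hello")
        span.add_event("Summarize process completed.")
        return result

print(asyncio.run(run_traced(NoOpTracer())))
```

With the real helpers in scope, the equivalent call would be `await execute_in_span("Summarize", summarize, {"text": "hello"}, tracer)`.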
Great thank you, will take a look and keep you posted
Thanks David. Looking forward to it.
Hmmm, I'm having a hard time reproducing the transient error you're getting. Running your code with a simple LLM app, I was able to successfully get traces into the platform. Can I ask about the nature of your application? Would you expect it to produce a high volume of traces?
The code is pretty straightforward, David. I'm using AzureOpenAI for the traced calls. The volume of traces isn't high at the moment; I'm just trying it out.
David Monical, any guesses? Could it be due to the OpenAI version? Is there a geographical restriction on the service? It was working just fine a few days ago.
Hi Usman, appreciate the patience as we look into this. I would not expect it to be the OpenAI version, though if something changed there and you want to test that hypothesis, feel free to do so. There should be no geographical restriction on the service that I know of. I'm following up with the team on this and will keep you updated. Sorry again for the difficulties.
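For anyone hitting this thread later: one knob sometimes worth trying for repeated DEADLINE_EXCEEDED errors is the OTLP exporter timeout. This is a hedged suggestion, not a confirmed fix for this case. Note the OpenTelemetry spec defines this variable in milliseconds, but some Python SDK versions have read it in seconds, so check your SDK's docs for the unit.

```shell
# Raise the per-export timeout for the OTLP trace exporter from its
# 10-second default (spec unit: milliseconds; verify against your SDK).
export OTEL_EXPORTER_OTLP_TRACES_TIMEOUT=30000
```

Separately, swapping SimpleSpanProcessor for BatchSpanProcessor moves exports off the request path, which can smooth over brief network slowness, though it won't fix a genuinely unreachable endpoint.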
