hi @[private user], thanks for coming back here - i think the question is now well answered by the visual example you provided of how it's represented. so the graph edges are inferred from the spans/traces, but the hierarchy is not; that is set explicitly through parent_id / id.
@[private user] that's basically just the doc page i linked - do you have any other sources to check? how could i trust that you have more context than just that doc page?
in Arize AX, there's a nice agent flow view, which can be instrumented using context managers and attributes (link: https://arize.com/docs/ax/observe/agents/implementing-agent-metadata-for-arize). what exactly does parent_id mean in this example? what would the output graph look like?
output:
=== Test 1: Register with HTTP transport ===
🔭 OpenTelemetry Tracing Details 🔭
| Arize Project: test-minimal
| Span Processor: Multiple Span Processors
| Collector Endpoint: Multiple Span Exporters
| Transport: Multiple Span Exporters
| Transport Headers: Multiple Span Exporters
|
| Using a default SpanProcessor. `add_span_processor` will overwrite this default.
|
| `register` has set this TracerProvider as the global OpenTelemetry default.
| To disable this behavior, call `register` with `set_global_tracer_provider=False`.
✅ Register successful
Tracer provider: <arize.otel.otel.TracerProvider object at 0x105c01400>
=== Test 2: Add instrumentation ===
✅ Instrumentation successful
=== Test 3: Simple OpenAI call ===
DEBUG:openai._base_client:Request options: {'method': 'post', 'url': '/chat/completions', 'files': None, 'idempotency_key': 'stainless-python-retry-17bda8a2-272a-445d-95ba-0dd67805a5e2', 'json_data': {'messages': [{'role': 'user', 'content': "Say 'Hello from minimal test'"}], 'model': 'gpt-4o-mini', 'max_tokens': 10}}
DEBUG:openai._base_client:Sending HTTP Request: POST https://api.openai.com/v1/chat/completions
DEBUG:httpcore.connection:connect_tcp.started host='api.openai.com' port=443 local_address=None timeout=5.0 socket_options=None
DEBUG:httpcore.connection:connect_tcp.complete return_value=<httpcore._backends.sync.SyncStream object at 0x10a3e57f0>
DEBUG:httpcore.connection:start_tls.started ssl_context=<ssl.SSLContext object at 0x108ede3c0> server_hostname='api.openai.com' timeout=5.0
DEBUG:httpcore.connection:start_tls.complete return_value=<httpcore._backends.sync.SyncStream object at 0x10a339950>
DEBUG:httpcore.http11:send_request_headers.started request=<Request [b'POST']>
DEBUG:httpcore.http11:send_request_headers.complete
DEBUG:httpcore.http11:send_request_body.started request=<Request [b'POST']>
DEBUG:httpcore.http11:send_request_body.complete
DEBUG:httpcore.http11:receive_response_headers.started request=<Request [b'POST']>
DEBUG:httpcore.http11:receive_response_headers.complete return_value=(b...)
INFO:httpx:HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 200 OK"
DEBUG:httpcore.http11:receive_response_body.started request=<Request [b'POST']>
DEBUG:httpcore.http11:receive_response_body.complete
DEBUG:httpcore.http11:response_closed.started
DEBUG:httpcore.http11:response_closed.complete
DEBUG:openai._base_client:HTTP Response: POST https://api.openai.com/v1/chat/completions "200 OK" Headers([(...])
DEBUG:openai._base_client:request_id: req_fdd41df19a4a35a513dafaa5931b6963
✅ OpenAI call successful: Hello from minimal test!
=== Test complete ===
DEBUG:openai.agents:Shutting down trace provider
DEBUG:openai.agents:Shutting down trace processor <openinference.instrumentation.openai_agents._processor.OpenInferenceTracingProcessor object at 0x108f482f0>
DEBUG:httpcore.connection:close.started
DEBUG:httpcore.connection:close.complete
@[private user] what else
script:
import os
from dotenv import load_dotenv

load_dotenv()

print("Testing Arize tracing setup...")
print(f"ARIZE_SPACE_ID: {os.getenv('ARIZE_SPACE_ID')}")
print(f"ARIZE_API_KEY: {os.getenv('ARIZE_API_KEY')[:20]}..." if os.getenv('ARIZE_API_KEY') else "None")

# Test 1: Just register
print("\n=== Test 1: Register only ===")
try:
    from arize.otel import register
    tracer_provider = register(
        space_id=os.getenv("ARIZE_SPACE_ID"),
        api_key=os.getenv("ARIZE_API_KEY"),
        project_name="test-minimal",
    )
    print("✅ Register successful")
    print(f"Tracer provider: {tracer_provider}")
except Exception as e:
    print(f"❌ Register failed: {e}")

# Test 2: Add instrumentation
print("\n=== Test 2: Add instrumentation ===")
try:
    from openinference.instrumentation.openai_agents import OpenAIAgentsInstrumentor
    instrumentor = OpenAIAgentsInstrumentor()
    instrumentor.instrument(tracer_provider=tracer_provider)
    print("✅ Instrumentation successful")
except Exception as e:
    print(f"❌ Instrumentation failed: {e}")

# Test 3: Simple OpenAI call
print("\n=== Test 3: Simple OpenAI call ===")
try:
    import openai
    client = openai.OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Say 'Hello from minimal test'"}],
        max_tokens=10,
    )
    print(f"✅ OpenAI call successful: {response.choices[0].message.content}")
except Exception as e:
    print(f"❌ OpenAI call failed: {e}")

print("\n=== Test complete ===")

outputs:
Testing Arize tracing setup...
ARIZE_SPACE_ID: U3B..
ARIZE_API_KEY: ak-f...
=== Test 1: Register only ===
🔭 OpenTelemetry Tracing Details 🔭
| Arize Project: test-minimal
| Span Processor: BatchSpanProcessor
| Collector Endpoint: otlp.arize.com
| Transport: gRPC
| Transport Headers: {'authorization': '****', 'api_key': '****', 'arize-space-id': '****', 'space_id': '****', 'arize-interface': '****', 'user-agent': '****'}
|
| Using a default SpanProcessor. `add_span_processor` will overwrite this default.
|
| `register` has set this TracerProvider as the global OpenTelemetry default.
| To disable this behavior, call `register` with `set_global_tracer_provider=False`.
✅ Register successful
Tracer provider: <arize.otel.otel.TracerProvider object at 0x1009a5160>
=== Test 2: Add instrumentation ===
✅ Instrumentation successful
=== Test 3: Simple OpenAI call ===
✅ OpenAI call successful: Hello from minimal test!
=== Test complete ===

still, no traces in the UI. @[private user] what else to try?
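One common cause of "everything succeeds but nothing shows up" is that the default BatchSpanProcessor buffers spans in memory and the script exits before the batch is exported; calling `tracer_provider.force_flush()` (and `shutdown()`) before exit is a real OpenTelemetry SDK call worth trying here. The toy below is stdlib only, not the OTel SDK, and just illustrates why buffered spans vanish when the process ends without a flush:

```python
# Toy model of a batching span processor (stdlib only, NOT the OTel SDK):
# finished spans sit in an in-memory queue and only reach the
# "collector" when the queue is flushed.
class ToyBatchProcessor:
    def __init__(self):
        self.queue = []     # spans waiting to be exported
        self.exported = []  # spans the collector actually received

    def on_end(self, span):
        self.queue.append(span)  # buffered, not yet sent anywhere

    def force_flush(self):
        self.exported.extend(self.queue)  # drain buffer to the collector
        self.queue.clear()

proc = ToyBatchProcessor()
proc.on_end("llm-call-span")
print(proc.exported)  # [] -- if the process exits here, the span is lost
proc.force_flush()    # the analogue of tracer_provider.force_flush()
print(proc.exported)  # ['llm-call-span']
```

In the real SDK the batch is exported on a timer and on shutdown, but a short-lived script can easily exit before either fires, so an explicit flush at the end of the run is a cheap thing to rule out.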
import os

from arize.otel import register, Transport
from openinference.instrumentation.openai_agents import (
    OpenAIAgentsInstrumentor,
)
import openai

tracer_provider = register(
    space_id=os.getenv("ARIZE_SPACE_ID"),
    api_key=os.getenv("ARIZE_API_KEY"),
    project_name="joke-evaluation-langgraph",
    endpoint="https://otlp.arize.com/v1/traces",
    transport=Transport.HTTP,
    log_to_console=True,
)
OpenAIAgentsInstrumentor().instrument(tracer_provider=tracer_provider)
self.client = openai
self.callbacks = []
outputs:
🔭 OpenTelemetry Tracing Details 🔭
| Arize Project: joke-evaluation-langgraph
| Span Processor: Multiple Span Processors
| Collector Endpoint: Multiple Span Exporters
| Transport: Multiple Span Exporters
| Transport Headers: Multiple Span Exporters
|
| Using a default SpanProcessor. `add_span_processor` will overwrite this default.
|
| `register` has set this TracerProvider as the global OpenTelemetry default.
| To disable this behavior, call `register` with `set_global_tracer_provider=False`.
LangGraph Joke Evaluation Agent Starting...

no further logs -- what else to try? api key and space id are correct. @[private user]
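Since "api key and space id are correct" is doing a lot of work here, it's worth verifying they are actually visible in the process that runs the agent (load_dotenv has to run before register, and the agent may run from a different working directory than the test script). A small stdlib check, with the variable names taken from the script above:

```python
import os

# Env vars the register() call above reads; nothing else is assumed.
REQUIRED = ("ARIZE_SPACE_ID", "ARIZE_API_KEY")

def missing_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]

# Example with a fake environment dict instead of os.environ:
print(missing_vars({"ARIZE_SPACE_ID": "U3B..."}))  # ['ARIZE_API_KEY']
```

Dropping a call like `missing_vars()` right before `register(...)` in the agent's own entrypoint rules out the case where the .env file loads in the minimal test but not in the real run.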
@[private user]
kinda a big change though
the docs randomly drop the "arize" anyway, or call the product Arize vs. Arize Phoenix
