I'm having issues sending traces from my TypeScript Vercel AI SDK app to cloud Phoenix. I've followed these two guides to the T but no luck: https://arize.com/docs/phoenix/integrations/vercel/vercel-ai-sdk-tracing-js and https://arize.com/docs/phoenix/tracing/llm-traces-1/quickstart-tracing-ts. My instrumentation.ts looks like this:
```ts
import { registerOTel } from '@vercel/otel';
import * as Sentry from '@sentry/nextjs';
// Phoenix/Arize imports
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import {
  isOpenInferenceSpan,
  OpenInferenceBatchSpanProcessor
} from "@arizeai/openinference-vercel";
import { SEMRESATTRS_PROJECT_NAME } from "@arizeai/openinference-semantic-conventions";

export async function register() {
  // Phoenix configuration
  const PHOENIX_COLLECTOR_ENDPOINT = process.env.PHOENIX_COLLECTOR_ENDPOINT;
  const SERVICE_NAME = 'ai-chatbot';

  // Create Phoenix span processor if endpoint is configured
  let phoenixSpanProcessor: OpenInferenceBatchSpanProcessor | undefined;
  if (PHOENIX_COLLECTOR_ENDPOINT && process.env.PHOENIX_API_KEY) {
    phoenixSpanProcessor = new OpenInferenceBatchSpanProcessor({
      exporter: new OTLPTraceExporter({
        headers: {
          // API key if you're sending it to Phoenix
          api_key: process.env["PHOENIX_API_KEY"],
        },
        url:
          process.env["PHOENIX_COLLECTOR_ENDPOINT"] ||
          "https://app.phoenix.arize.com/v1/traces",
      }),
      spanFilter: (span) => {
        // Only export spans that are OpenInference to remove non-generative spans
        // This should be removed if you want to export all spans
        console.log('[DEBUG]span', span);
        return isOpenInferenceSpan(span);
      },
    });
  }

  // Register OpenTelemetry with Vercel's setup, adding Phoenix processor
  registerOTel({
    serviceName: SERVICE_NAME,
    // Add Phoenix span processor alongside Vercel's default processors
    spanProcessors: phoenixSpanProcessor ? [phoenixSpanProcessor as any] : undefined,
    // Override resource attributes to include Phoenix-specific ones
    // attributes: {
    //   [SEMRESATTRS_PROJECT_NAME]: SERVICE_NAME,
    // }
  });

  if (process.env.NEXT_RUNTIME === 'nodejs') {
    await import('./sentry.server.config');
  }
  if (process.env.NEXT_RUNTIME === 'edge') {
    await import('./sentry.edge.config');
  }
}

export const onRequestError = Sentry.captureRequestError;
```
I've added a console.log in the span filter and am seeing those logs come through, but no traces in Phoenix online. I have set these two variables:

```
PHOENIX_API_KEY=''
PHOENIX_COLLECTOR_ENDPOINT="https://app.phoenix.arize.com/v1/traces"
```

and I call streamText with experimental_telemetry: {enabled:true}. These are my OTel + OpenInference packages:
```
@arizeai/openinference-semantic-conventions 2.1.0
@arizeai/openinference-vercel 2.2.2
@vercel/otel 1.13.0
@opentelemetry/api 1.8.0
@opentelemetry/api-logs 0.46.0
@opentelemetry/exporter-trace-otlp-proto 0.46.0
@opentelemetry/instrumentation 0.46.0
@opentelemetry/resources 1.19.0
@opentelemetry/sdk-logs 0.46.0
@opentelemetry/sdk-metrics 1.19.0
@opentelemetry/sdk-trace-base 1.19.0
@opentelemetry/sdk-trace-node 1.19.0
@opentelemetry/semantic-conventions 1.19.0
```

Not sure where I'm going wrong since the span filter is logging.
If your span filter is logging but traces are not appearing in Phoenix Cloud, the most common issues are: (1) an incorrect API key or endpoint, (2) missing required Phoenix resource attributes, or (3) network/exporter errors. Your code and environment variable setup generally match the official Phoenix Vercel AI SDK tracing guide and TypeScript tracing quickstart, but there are a few things to check:
1. API Key and Endpoint: Ensure PHOENIX_API_KEY is set to your actual Phoenix Cloud API key (not an empty string), and PHOENIX_COLLECTOR_ENDPOINT should be https://app.phoenix.arize.com/v1/traces (no angle brackets).
2. Resource Attributes: You should set the Phoenix project name using the attributes option in registerOTel, e.g., attributes: { [SEMRESATTRS_PROJECT_NAME]: SERVICE_NAME }.
3. Exporter Errors: If spans are being created but not exported, check for errors in the browser/server logs related to the OTLP exporter.
4. Telemetry Enablement: Ensure you are calling your LLM with experimental_telemetry: { isEnabled: true } (not just enabled: true).
5. BatchSpanProcessor Flushing: The BatchSpanProcessor may not flush before the process exits. Try using the SimpleSpanProcessor for debugging, or ensure your app stays alive long enough for the batch to export.

Example minimal working setup (from the official docs):

```ts
import { registerOTel } from "@vercel/otel";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import {
  isOpenInferenceSpan,
  OpenInferenceSimpleSpanProcessor,
} from "@arizeai/openinference-vercel";
import { SEMRESATTRS_PROJECT_NAME } from "@arizeai/openinference-semantic-conventions";

export function register() {
  registerOTel({
    serviceName: "phoenix-next-app",
    attributes: {
      [SEMRESATTRS_PROJECT_NAME]: "your-next-app",
    },
    spanProcessors: [
      new OpenInferenceSimpleSpanProcessor({
        exporter: new OTLPTraceExporter({
          headers: { api_key: process.env["PHOENIX_API_KEY"] },
          url:
            process.env["PHOENIX_COLLECTOR_ENDPOINT"] ||
            "https://app.phoenix.arize.com/v1/traces",
        }),
        spanFilter: isOpenInferenceSpan,
      }),
    ],
  });
}
```

If you still see logs in the span filter but no traces in Phoenix, double-check the API key and endpoint, and look for exporter errors. Would you like a step-by-step troubleshooting checklist?

Sources:
- Vercel AI SDK Tracing (JS) | Phoenix
- TypeScript tracing quickstart
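To make item 4 above concrete, here is a minimal sketch of the per-call telemetry options. The `functionId` value and the commented-out `streamText` usage are illustrative assumptions, not code from this thread:

```typescript
// Sketch: the Vercel AI SDK telemetry flag is `isEnabled`, not `enabled`.
const telemetryOptions = {
  experimental_telemetry: {
    isEnabled: true,
    functionId: "chat", // optional: names the root span (hypothetical value)
  },
};

// Hypothetical usage with the ai package's streamText:
// const result = streamText({
//   model: yourModel,
//   prompt: "Hello",
//   ...telemetryOptions,
// });

console.log(telemetryOptions.experimental_telemetry.isEnabled); // → true
```

With `{ enabled: true }` the unknown key is silently ignored and no AI SDK spans are emitted, which matches "span filter logs but nothing generative is exported."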
💡 Hint: Mention RunLLM in the thread for followups.
RunLLM some more logs. After adding diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.DEBUG); this is what I'm seeing (the same error is logged twice):

```
{"stack":"OTLPExporterError: Unauthorized\n    at new OTLPExporterError (.next/server/chunks/node_modules__pnpm_5386cab4._.js:23120:28)\n    at IncomingMessage.<anonymous> (.next/server/chunks/node_modules__pnpm_5386cab4._.js:23254:37)\n    at IncomingMessage.emit (node:events:536:35)\n    at endReadableNT (node:internal/streams/readable:1698:12)\n    at process.processTicksAndRejections (node:internal/process/task_queues:82:21)","message":"Unauthorized","name":"OTLPExporterError","code":"401"}
```

I'm also seeing a duplicate registration of "trace" error line at the beginning.
The OTLPExporterError: Unauthorized with code 401 means your Phoenix Cloud API key is missing or incorrect. You must set PHOENIX_API_KEY to a valid API key from your Phoenix Cloud account—an empty string or placeholder will not work. Also, ensure your PHOENIX_COLLECTOR_ENDPOINT is set to https://app.phoenix.arize.com/v1/traces without angle brackets.
The duplicate "trace" registration warning is usually harmless but can occur if the OpenTelemetry provider is registered more than once; ensure your instrumentation setup runs only once per process.
Would you like detailed steps to resolve the API key issue?

Sources:
- Vercel AI SDK Tracing (JS) | Phoenix
- TypeScript tracing quickstart
- Authentication | Phoenix
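One way to sketch "ensure your instrumentation setup runs only once per process" is a guard flag. This is a generic pattern, not a Phoenix or @vercel/otel API; the `__otelRegistered` property name is made up for illustration:

```typescript
// Sketch: guard so OTel setup runs at most once per process.
// The __otelRegistered flag is an assumed name, not part of any library.
let registrations = 0;

function registerOnce(setup: () => void): void {
  const g = globalThis as { __otelRegistered?: boolean };
  if (g.__otelRegistered) return; // already registered in this process
  g.__otelRegistered = true;
  setup();
}

registerOnce(() => { registrations += 1; });
registerOnce(() => { registrations += 1; }); // no-op on the second call

console.log(registrations); // → 1
```

In a Next.js app this matters because instrumentation can be evaluated more than once (e.g., across hot reloads), which produces the duplicate "trace" registration warning.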
RunLLM I've gotten the system API key from the Phoenix app in the cloud, and I have a log that verifies it's actually correct once the app loads. My collector URL is https://app.phoenix.arize.com/v1/traces
If you are still seeing OTLPExporterError: Unauthorized (401) even with the correct system API key and collector URL, double-check that the API key is being sent in the correct header. For Phoenix Cloud, the header should be api_key, not Authorization. Your OTLP exporter config should look like:

```ts
exporter: new OTLPTraceExporter({
  headers: { api_key: process.env["PHOENIX_API_KEY"] },
  url: "https://app.phoenix.arize.com/v1/traces",
}),
```

Also, ensure there are no extra spaces or invisible characters in your API key or endpoint. If the error persists, try restarting your app and verify that the environment variable is available at runtime (not just at build time).

Would you like a checklist to debug environment variable loading and header formatting?

Sources:
- Vercel AI SDK Tracing (JS) | Phoenix
- TypeScript tracing quickstart
RunLLM that's literally what I'm doing:

```ts
phoenixSpanProcessor = new OpenInferenceSimpleSpanProcessor({
  exporter: new OTLPTraceExporter({
    headers: { api_key: process.env["PHOENIX_API_KEY"] },
    url: process.env["PHOENIX_COLLECTOR_ENDPOINT"]
  }),
  // spanFilter: (span) => {
  //   // Only export spans that are OpenInference to remove non-generative spans
  //   // This should be removed if you want to export all spans
  //   // console.log('[DEBUG]span', span);
  //   return isOpenInferenceSpan(span);
  // },
})
```
If you are still getting a 401 Unauthorized error despite using the correct API key and endpoint, the most likely causes are:
1. The API key is not being injected into the runtime environment where the OTLP exporter runs (e.g., Vercel serverless functions may require secrets to be set in the Vercel dashboard, not just .env files).
2. The API key or endpoint has invisible characters or whitespace—try logging their values with delimiters to confirm.
3. The OTLP exporter is not picking up the environment variable at runtime (e.g., due to build-time vs. runtime variable resolution).

This exact setup is supported and should work if the API key is present in the api_key header and the endpoint is correct, as shown in the official Phoenix docs.
Would you like step-by-step instructions to debug environment variable injection in your deployment environment?Sources:
- Vercel AI SDK Tracing (JS) | Phoenix
If you have confirmed via console log that process.env.PHOENIX_API_KEY is correct immediately before OTLPTraceExporter is constructed, but you still get a 401 error, the issue is almost certainly with how the header is being sent or interpreted by Phoenix Cloud.

Checklist:
1. No Angle Brackets: Ensure your endpoint is exactly https://app.phoenix.arize.com/v1/traces (no < > ).
2. No Extra Spaces: Log the header as { api_key: process.env["PHOENIX_API_KEY"] } right before passing to the exporter, and check for whitespace or invisible characters.
3. Vercel Environment: If deploying on Vercel, ensure the secret is set in the Vercel dashboard and not just in .env (Vercel does not load .env for serverless by default).
4. No Build-Time Substitution: Make sure the value is available at runtime, not just at build time.
5. No Multiple Exporters: Only one exporter should be sending to Phoenix; multiple registrations can cause issues.

If all of the above are correct and you still get 401, try regenerating the API key in Phoenix Cloud and updating your secret.

Would you like a minimal working example or more troubleshooting steps?

Sources:
- Vercel AI SDK Tracing (JS) | Phoenix
- TypeScript tracing quickstart
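Item 2 of the checklist (checking for whitespace or invisible characters) can be sketched with a small helper. The helper name and sample value are illustrative, not from the thread; in the app you would pass process.env.PHOENIX_API_KEY:

```typescript
// Sketch: print a value between delimiters so trailing whitespace or
// invisible characters become visible in logs.
function inspectValue(name: string, value: string | undefined): string {
  if (value === undefined) return `${name} is undefined`;
  // JSON.stringify escapes invisible characters like \n, \t, \u200b
  return `${name}=<${JSON.stringify(value)}> length=${value.length}`;
}

// Sample value standing in for process.env.PHOENIX_API_KEY:
const sample = "phx-key-123 "; // note the trailing space
console.log(inspectValue("PHOENIX_API_KEY", sample));
// The trailing space shows up inside the quotes and inflates the length:
console.log(sample.length !== sample.trim().length); // → true
```

A key with a stray trailing space or newline passes an eyeball check but fails server-side comparison with a 401.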
Yes, there are relevant Slack threads discussing similar issues with TypeScript-based instrumentation and Phoenix. Here are the most pertinent ones:
1. Slack thread: TypeScript telemetry setup and troubleshooting – This thread covers issues with setting up telemetry in TypeScript, including package versions and deprecation warnings.
2. Slack thread: Duplicate registration and environment variable issues – This thread discusses problems with TypeScript/Node.js instrumentation, including errors related to object extensibility and registration order.

Would you like a summary of the key insights from these threads?

Sources:
- TypeScript telemetry setup and troubleshooting
- Duplicate registration and environment variable issues
Super sorry about this Swaraj R. I think our docs are out of date here. Instead of a header called api_key, can you create one called "Authorization": `Bearer ${apiKey}`?
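For concreteness, the suggested header change looks like this. The key value is a placeholder; this sketch only builds the headers object and does not call Phoenix:

```typescript
// Sketch: send a standard Authorization bearer header instead of api_key.
// In the app, apiKey would come from process.env.PHOENIX_API_KEY.
const apiKey = "placeholder-key";

const headers = {
  Authorization: `Bearer ${apiKey}`,
};

// These headers would then be passed to the exporter, e.g.:
// new OTLPTraceExporter({ headers, url: "https://app.phoenix.arize.com/v1/traces" })

console.log(headers.Authorization); // → "Bearer placeholder-key"
```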
I tried that too actually just now
What is the url of your phoenix space
In your browser
