Has anyone ever succeeded in adding Phoenix to their Next.js app? I'm having issues adding instrumentation, and I'm not sure which documentation to follow. I have seen https://github.com/Arize-ai/openinference/tree/main/js/examples/llama-index-express/frontend but it doesn't seem to instrument the Next.js edge functions themselves; it uses a separate Node.js backend to run the LLM calls. This is where I'm stuck: I currently have basic LLM calls on Next.js edge endpoints, and I'm not sure how to proceed.
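Roughly, my endpoints look like this (a simplified sketch; the real route path and model are different, and I've inlined a plain fetch call to the OpenAI REST API for illustration):

```typescript
// app/api/chat/route.ts (hypothetical route, simplified)
export const runtime = "edge"; // runs on the Edge runtime, not Node.js

export async function POST(req: Request) {
  const { prompt } = await req.json();
  // Call the OpenAI chat completions REST API directly from the edge function
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo", // placeholder model
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  // Return just the assistant message to the client
  return new Response(JSON.stringify(data.choices[0].message), {
    headers: { "Content-Type": "application/json" },
  });
}
```

So there's no separate Node.js backend; the LLM call happens inside the edge handler itself.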
Hey Albert, a repro would be great. Are you using the AI SDK or LlamaIndex? Are you using RSCs? We're still just getting our Next.js instrumentation in place, so we'd love to get your input.
Thanks for the update. Yeah, let us know; we'd love to get your Next.js app instrumented. Next.js OTel support is still experimental, though, so it might take a bit of groundwork on our end.
https://github.com/albertpurnama/nextjs-phoenix Mikyo, here it is. I'm not sure whether I've provided enough information in the README.md, but please let me know if you need more context!
Thank you for making this open source!
At some point I want to be able to use RSCs in this app. Maybe Phoenix could even absorb this Next.js app as a sample use case for Next.js + Vercel AI SDK. I'd love to contribute, but honestly I don't know how; this is my first time trying OpenTelemetry. My background is in building web applications. Let me know where I can start!
Within the potential-solution-1 branch I also tried manual instrumentation: https://github.com/albertpurnama/nextjs-phoenix/pull/1/files
Hey Albert P., sorry for the delay; busy week. I've made some preliminary changes but haven't tested them out yet. There are some nuances to OTel with Next.js, and to the module resolution process (ESM), that I haven't fully worked out yet. We actually got LangChain.js spans out, but we had to explicitly add our instrumentation to the chain inside the "use server" block. I've made changes to the OpenAI instrumentor to allow for non-CommonJS instrumentation but haven't tried to wire it up to Next.js. A bit new to all the RSC stuff in Next, so I appreciate the patience 😅 I do have a ticket assigned to me; I can post some of my findings there if that would help: https://github.com/Arize-ai/phoenix/issues/3199
Hey Mikyo, thanks for replying. I know how difficult it is to keep up with all these new things! I'm not sure how I can help, but my DMs are open here if you want to discuss anything. I do most of my work in Next.js, so I should be able to help brainstorm things on the Next.js side (I'm not affiliated with them in any way; I just use their framework most of the time). You can also reach me via Twitter DM if you need.
Albert P., I got an example working with Next.js, with the caveat that custom OTel seems to only really work with Next.js 13, not 14 (which has all the RSC stuff). You can force the tracing to work with Next.js 14, but you have to re-instrument in every "use server" block. I think this is non-ideal and will probably flag it with the Next.js team when I get a chance. I'm guessing this isn't exactly what you were looking for, unless you're still on Next 13? https://github.com/Arize-ai/openinference/pull/489
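For anyone following along, the manual-instrumentation wiring looks roughly like this (a sketch only; package names are from the OpenInference repo, but the exact setup for your app may differ, and this only applies to the Node.js runtime, not edge):

```typescript
// instrumentation.node.ts — rough sketch, Node.js runtime only.
// Next's bundler resolves modules as ESM, which can bypass the usual
// require-hook patching, hence the explicit manuallyInstrument call.
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";
import OpenAI from "openai";

const openAIInstrumentation = new OpenAIInstrumentation();

// Patch the imported module object directly instead of relying on
// automatic module interception, which the ESM bundling defeats.
openAIInstrumentation.manuallyInstrument(OpenAI);

registerInstrumentations({ instrumentations: [openAIInstrumentation] });
```

With Next 14 you then end up repeating the manuallyInstrument call inside each "use server" block, which is the non-ideal part I mentioned.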
