Hi, we are running self-hosted versions of both Arize Phoenix and Langflow and are running into a few issues.
Issue 1:
No clear way to match an LLM call with its related Chain; data captured in both is needed to calculate our observability metrics.
We have relied on string and time matching, which seems to work, but a way to configure a common id would be ideal.
Issue 2:
LLM calls from one agent are showing up in other agents' projects in Arize Phoenix.
For example, in the invoker_health_check agent's project we are seeing LLM calls from Sourcing_Agent, and vice versa.
We believe this may be because we are calling 5 agents in Langflow through the API in parallel Python async calls,
but we are unsure how to fix this.
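A sketch of what we suspect the fix looks like: give each concurrent agent call its own session id so spans from parallel requests cannot be interleaved. The `post` callable and payload shape below are our own placeholders (in practice it would wrap something like httpx.AsyncClient.post); this is not a confirmed Langflow API contract.

```python
import asyncio
import uuid

async def run_agents_in_parallel(post, flows: dict[str, str]) -> dict:
    """Call several Langflow flows concurrently, one session id each.

    `post` is any coroutine taking (flow_id, payload) and returning the
    response body -- injected so the sketch stays transport-agnostic.
    """
    async def one(name: str, flow_id: str):
        payload = {
            "input_value": f"run {name}",
            # Distinct id per agent call, so concurrent requests keep
            # their spans separated by id rather than by timing.
            "session_id": f"{name}-{uuid.uuid4()}",
        }
        return name, await post(flow_id, payload)

    results = await asyncio.gather(*(one(n, f) for n, f in flows.items()))
    return dict(results)
```

Even if this does not fix the project attribution on its own, it would at least make the cross-contamination visible and filterable.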
Issue 3:
Chat input observability is not working:
Sometimes the attributes.input.value field of the chat input span contains the entire system instructions.
The attributes.output.value field is almost always blank, even when we know something was sent through.
Issue 4:
OpenAI span observability issues: for OpenAI components that get called using a chat input in Langflow, the content field is sometimes completely omitted:

    "input_messages": [
      { "message": { "role": "system", "content": "our system instructions" } },
      { "message": { "role": "user" } }
    ]

Issue 5:
Agent observability issues: for one specific agent, the LLM span is not coming through anywhere, in any project.

Further details:
We are using the native Arize Phoenix integration; no code changes have been made.
We are calling about 10 agents as Langflow Flows through the Langflow API in Python.
We are still on Langflow version 1.5.1 and looking to begin moving to 1.6.
We are using Arize Phoenix version 11.37.0
