No, I think it is good to share in case someone else has the same 'issue' 🙂
I will move that convo to the dedicated thread arize-platform-support 😉
inside every node of the langgraph I am using langchain as much as possible
at a very high level the stack looks like this:
get the message history stored in Firestore
use langchain's init_chat_model with OpenAI models
then use langchain_pinecone's PineconeVectorStore (but it might soon change to Qdrant)
complete the search with a web crawl using GoogleSearchAPIWrapper and the Firecrawl API (without their SDK, direct curl)
using langgraph mostly to simplify the async "parallel" calls to the APIs 🙂
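To illustrate the last point, here is a minimal sketch of the kind of "parallel" fan-out langgraph takes care of. All the retrieval calls (Pinecone, Google search, Firecrawl) are replaced with hypothetical stub coroutines, so this only shows the concurrency pattern, not the real API calls:

```python
import asyncio

# Stub coroutines standing in for the real retrieval sources.
# In the actual graph these would be e.g. PineconeVectorStore
# similarity search and GoogleSearchAPIWrapper + Firecrawl calls.

async def search_vector_store(query: str) -> list[str]:
    await asyncio.sleep(0)  # placeholder for the vector store lookup
    return [f"pinecone hit for {query!r}"]

async def search_web(query: str) -> list[str]:
    await asyncio.sleep(0)  # placeholder for Google search + Firecrawl crawl
    return [f"web hit for {query!r}"]

async def gather_context(query: str) -> list[str]:
    # langgraph runs parallel branches roughly like this fan-out/fan-in:
    results = await asyncio.gather(
        search_vector_store(query),
        search_web(query),
    )
    # flatten the per-source result lists into one context list
    return [doc for chunk in results for doc in chunk]

if __name__ == "__main__":
    print(asyncio.run(gather_context("example")))
```

In the real graph each stub would be a node, and langgraph handles scheduling the branches and merging their outputs back into the shared state.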