Hey, thanks [private user] and [private user]. That's very helpful. Another quick question: what should the format of the tool definition string be? Could you provide an example of one of the outputs for it?
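While waiting for a confirmed answer, here is a minimal sketch of what such a tool definition string might look like, assuming the OpenAI function-calling format (the function name, description, and parameters below are made-up placeholders, not something from this thread):

```python
import json

# Sketch of a single tool definition in the OpenAI function-calling
# "tools" schema. The function itself (get_weather) is a hypothetical
# placeholder used only for illustration.
tool_definition = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# Evaluation helpers that take the definition as a string would
# typically expect it JSON-serialized:
tool_definition_str = json.dumps(tool_definition, indent=2)
print(tool_definition_str)
```

Whether Phoenix expects exactly this JSON shape (or a plainer text description) is what the question above is asking, so treat this only as a starting point.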
Hey, Xander S. It's me again. Regarding this point that LLMApp mentioned:
• Your model supports function calling,
• Your prompt template is correctly structured to yield an explanation,
• And that your DataFrame includes a properly named "tool_call" column with exactly the expected content.
Do you think this could be causing the issue?
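For concreteness, a minimal sketch of the DataFrame shape the third bullet describes: a "tool_call" column holding the call the model actually made. The column contents and the JSON structure here are assumptions for illustration, not a confirmed Phoenix schema:

```python
import json
import pandas as pd

# Hypothetical one-row DataFrame with a "tool_call" column, as the
# checklist above requires. The question text and tool call payload
# are placeholders.
df = pd.DataFrame(
    {
        "question": ["What's the weather in Paris?"],
        "tool_call": [
            json.dumps({"name": "get_weather", "arguments": {"city": "Paris"}})
        ],
    }
)

print(df["tool_call"].iloc[0])
```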
In the newest version of the notebook, the install is:

! pip install -qq "arize-phoenix==6.1.0" "llama-index-llms-openai==0.2.2" "openai==1.57.0" gcsfs==2024.10.0 nest_asyncio==1.6.0 langchain==0.3.10 langchain-openai==0.2.11 openinference-instrumentation-langchain==0.1.29

but I am installing this instead:

! pip install -qq "arize-phoenix[llama-index]" "arize-phoenix[evals]>=0.14.0" "llama-index-llms-openai" "openai>=1" gcsfs nest_asyncio langchain langchain-openai