Thanks, Roger -- I tried but still have the same problem, so I'll wait a bit to let things ripple through the system and then try another attempt. Regards, David
Great - thanks very much, Mikyo! I'll give it a try... Regards, David
Hi -- I'm slowly upgrading my code to LlamaIndex 0.10, and have upgraded Phoenix to 3.0.3 and openinference-instrumentation-llama-index==1.0.0. However, I get conflicting demands on which packages to install:

OpenAIInstrumentor().instrument()

import llama_index.core.callbacks, llama_index.core.callbacks.base_handler
from llama_index.core.callbacks.base_handler import BaseCallbackHandler
from phoenix.trace.llama_index import OpenInferenceTraceCallbackHandler

callback_handler = OpenInferenceTraceCallbackHandler()

set_global_handler("arize_phoenix")
llama_index.set_global_handler("arize_phoenix")
---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
File ~\AppData\Roaming\Python\Python311\site-packages\llama_index\core\callbacks\global_handlers.py:42, in create_global_handler(eval_mode, **eval_params)
41 try:
---> 42 from llama_index.callbacks.arize_phoenix import (
43 arize_phoenix_callback_handler,
44 ) # pants: no-infer-dep
45 except ImportError:
ModuleNotFoundError: No module named 'llama_index.callbacks'
During handling of the above exception, another exception occurred:
ImportError Traceback (most recent call last)
Cell In[3], line 10
7 # Initialize the callback handler
8 callback_handler = OpenInferenceTraceCallbackHandler()
---> 10 set_global_handler("arize_phoenix")
11 llama_index.set_global_handler("arize_phoenix")
File ~\AppData\Roaming\Python\Python311\site-packages\llama_index\core\callbacks\global_handlers.py:11, in set_global_handler(eval_mode, **eval_params)
8 """Set global eval handlers."""
9 import llama_index.core
---> 11 llama_index.core.global_handler = create_global_handler(eval_mode, **eval_params)
File ~\AppData\Roaming\Python\Python311\site-packages\llama_index\core\callbacks\global_handlers.py:46, in create_global_handler(eval_mode, **eval_params)
42 from llama_index.callbacks.arize_phoenix import (
43 arize_phoenix_callback_handler,
44 ) # pants: no-infer-dep
45 except ImportError:
---> 46 raise ImportError(
47 "ArizePhoenixCallbackHandler is not installed. "
48 "Please install it using `pip install llama-index-callbacks-arize-phoenix`"
49 )
51 handler = arize_phoenix_callback_handler(**eval_params)
52 elif eval_mode == "honeyhive":
ImportError: ArizePhoenixCallbackHandler is not installed. Please install it using `pip install llama-index-callbacks-arize-phoenix`

Installing llama-index-callbacks-arize-phoenix forces my Phoenix version back down to 2.11.1:
Installing collected packages: arize-phoenix
Attempting uninstall: arize-phoenix
Found existing installation: arize-phoenix 3.0.3
Uninstalling arize-phoenix-3.0.3:
Successfully uninstalled arize-phoenix-3.0.3
Successfully installed arize-phoenix-2.11.1

Also, from the first exception above (No module named llama_index.callbacks), here's a further traceback. Is there anything I can do from my side to reference this module properly, or does the change need to be made in Phoenix?

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
Cell In[8], line 5
      2 import llama_index.core.callbacks, llama_index.core.callbacks.base_handler
      4 from llama_index.core.callbacks.base_handler import BaseCallbackHandler
----> 5 from phoenix.trace.llama_index import OpenInferenceTraceCallbackHandler
      7 # Initialize the callback handler
      8 callback_handler = OpenInferenceTraceCallbackHandler()
File ~\AppData\Local\anaconda3\envs\test\Lib\site-packages\phoenix\trace\llama_index\__init__.py:1
----> 1 from .callback import OpenInferenceTraceCallbackHandler
      2 from .debug_callback import DebugCallbackHandler
      4 __all__ = ["OpenInferenceTraceCallbackHandler", "DebugCallbackHandler"]
File ~\AppData\Local\anaconda3\envs\test\Lib\site-packages\phoenix\trace\llama_index\callback.py:33
     30 from uuid import uuid4
     32 import llama_index
---> 33 from llama_index.callbacks.base_handler import BaseCallbackHandler
     34 from llama_index.callbacks.schema import (
     35     TIMESTAMP_FORMAT,
     36     CBEvent,
     37     CBEventType,
     38     EventPayload,
     39 )
     40 from llama_index.llms.types import ChatMessage, ChatResponse
ModuleNotFoundError: No module named 'llama_index.callbacks'
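As a quick sanity check, a stdlib-only sketch along these lines can confirm which module layout is actually importable in the environment (`module_available` is just a throwaway helper for illustration, not a Phoenix or LlamaIndex API):

```python
import importlib.util


def module_available(name: str) -> bool:
    """Return True if the dotted module path is importable here."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package in the dotted path is missing.
        return False


# The pre-0.10 monolithic package exposes llama_index.callbacks;
# 0.10+ moves it under llama_index.core.callbacks.
for mod in ("llama_index.callbacks", "llama_index.core.callbacks"):
    print(f"{mod}: {'present' if module_available(mod) else 'missing'}")
```

If the first prints "missing" and the second "present", only the 0.10 layout is installed, which matches the ModuleNotFoundError above.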
OK, great -- thanks, Roger!
Hi -- I'm running code with Phoenix 3.0 where I create queries and then switch to an eval context for evaluation. I'm getting a non-catastrophic error with every query I evaluate:
  0%|          | 0/3 [00:00<?, ?it/s]
Failed to detach context
Traceback (most recent call last):
File "C:\Users\test\Lib\site-packages\opentelemetry\context\__init__.py", line 163, in detach
_RUNTIME_CONTEXT.detach(token) # type: ignore
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\test\Lib\site-packages\opentelemetry\context\contextvars_context.py", line 50, in detach
self._current_context.reset(token) # type: ignore
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ValueError: <Token var=<ContextVar name='current_context' default={} at 0x000001F8423CA570> at 0x000001F84262A840> was created in a different Context

Phoenix still seems to log the information OK, but I was wondering whether there's any lasting negative implication beyond the error message itself (which it would be great to eliminate, given that the offending line is already marked # type: ignore). I can demonstrate in a live session with someone if that helps. Regards, David
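The underlying mechanism is a plain contextvars rule: the token returned by ContextVar.set() can only be reset in the Context where set() ran. A minimal stdlib sketch that reproduces the same ValueError (assuming, as seems likely, that this is what OpenTelemetry hits when attach and detach happen in different contexts, e.g. across asyncio tasks or threads):

```python
import contextvars

# A ContextVar analogous to OpenTelemetry's current_context.
var = contextvars.ContextVar("current_context", default={})


def attach_in_fresh_context():
    # set() runs inside a copied Context, so the returned token
    # belongs to that Context, not to the caller's.
    ctx = contextvars.copy_context()
    return ctx.run(var.set, {"span": "example"})


token = attach_in_fresh_context()
try:
    var.reset(token)  # "detaching" in the outer context
except ValueError as err:
    # Prints the token repr followed by
    # "was created in a different Context"
    print(err)
```

So the error is a symptom of where the detach runs, not of data loss, which would be consistent with the traces still landing in Phoenix.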
Hi Xander -- I DM'd you a longer trace to check out. The code itself is as shown originally: Create and save a trace, and then immediately try reloading the session via the commands shown. Thanks and regards, David
Hi -- Congratulations on the 3.0 release! I've downloaded it and have been working through the conversion process. However, there still doesn't seem to be a way to properly load from a saved tracing session. I'm able to successfully save as follows:

from phoenix.trace import TraceDataset

dataset = px.active_session().get_trace_dataset()
dataset_id = dataset.save()

However, when I try loading, I continue to get errors:

tds = TraceDataset.load(dataset_id)
px.launch_app(trace=tds)
px.launch_app(trace=px.TraceDataset(dataframe=tds.dataframe))  # tried this instead of the line above, but still the same error
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
Cell In[23], line 9
7 tds = TraceDataset.load(dataset_id)
8 #px.launch_app(trace=tds)
----> 9 px.launch_app(trace=px.TraceDataset(dataframe=tds.dataframe))
TypeError: float() argument must be a string or a real number, not 'dict'

Could you please advise on a workable solution for this in 3.0? I couldn't find any updated documentation advising me to do this differently, but I may have missed it. Thanks and regards, David
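In case it helps narrow things down, here's a small stdlib sketch (the record and field names are made up for illustration) for locating which fields in the saved trace rows carry dict values, since a dict reaching a float() conversion is the likely trigger for the TypeError above:

```python
def find_dict_fields(records):
    """Return the sorted names of fields whose values are dicts."""
    bad = set()
    for row in records:
        for key, value in row.items():
            if isinstance(value, dict):
                bad.add(key)
    return sorted(bad)


# Hypothetical trace rows; only "attributes" holds dicts here.
rows = [
    {"latency_ms": 12.5, "attributes": {"llm": "local"}},
    {"latency_ms": 8.0, "attributes": {}},
]
print(find_dict_fields(rows))  # ['attributes']
```

For a pandas DataFrame like tds.dataframe, df.to_dict("records") yields rows in this shape, so the helper can point at the offending column before launch_app is even called.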
Hi -- I love the visualization tool, but I find that usually when I mouse over a document in the point cloud, the big description box obscures the arrow paths that point from that document to other documents in the corpus. Could you perhaps relocate the description box to a neutral area of the screen? Regards, David
Hi Mikyo -- Re: "David, if you can remove some of the duplicate instrumentation." I'm not sure which instrumentation is duplicated. For a local LLM, am I able to delete anything from A, C, or D and still have tracing work properly? Perhaps you're referring to the following lines, one of which could likely be deleted?

set_global_handler("arize_phoenix")
llama_index.set_global_handler("arize_phoenix")

Thanks and regards, David
Hi Roger --
Local LLM does not return query chains if I omit A or C (but include B&D).
"So any time C is included, the get_retrieved_documents should work because that's provided by the Llama-Index callback system" I find this is only true if both A and C are included. If A is omitted, including C only reproduces my previous issue of missing query traces, and get_retrieved_documents returns an empty frame. That said, I think A is needed because I'm using a local LLM, but your statement may hold true for OpenAI LLMs. Thanks and regards, David
