launching a script using run_experiment can't be stopped with Ctrl+C. what's the recommended way to stop an experiment after it gets started?
It seems likely that recent versions of Phoenix (8.28+) have improved shutdown handling, so stopping a script running run_experiment with Ctrl+C should now exit cleanly without a traceback or non-zero exit code. Previously, Ctrl+C would show a KeyboardInterrupt and a CancelledError; now the interrupt is swallowed and the process exits with code 0, reflecting intentional termination. This applies to both server and experiment runs, in terminal and notebook environments, so Ctrl+C is the recommended way to stop an experiment in progress (release notes, PR #7305).

If you are running Phoenix in a notebook, you can also use px.close_app() to gracefully shut down the background session (manage the app). Would you like more detail on handling shutdowns in specific environments or troubleshooting issues with Ctrl+C?

Sources:
- feat: gracefully handle ctrl-c
- release notes
- manage the app
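
The "swallow the interrupt and exit 0" behavior described above can be sketched in plain Python. The `run_until_interrupt` helper and `interrupted_task` below are hypothetical illustrations of the pattern, not Phoenix's actual shutdown code:

```python
def run_until_interrupt(work):
    """Run `work()`; treat Ctrl+C (KeyboardInterrupt) as an intentional shutdown."""
    try:
        work()
    except KeyboardInterrupt:
        # swallow the interrupt so the process can exit cleanly with code 0
        pass
    return 0


def interrupted_task():
    # stands in for an experiment run that receives Ctrl+C mid-flight
    raise KeyboardInterrupt


exit_code = run_until_interrupt(interrupted_task)
```

A script's entry point would pass `exit_code` to `sys.exit()`, so the shell sees a clean exit rather than an error.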
💡 Hint: Mention RunLLM in the thread for followups.
I'm using arize-phoenix 8.32.0
it just doesn't respond to ctrl-c and keeps running
it seems to be spawning many processes under the hood when I check with ps ax
it's not stuck. I just realized there's an error in my task, so I'd like to stop the experiment, fix the bug, and rerun.
I mostly ask because run_experiment should not be spawning processes unless your task is doing so
are you running run_experiment in sync or async mode?
(if you are using nest_asyncio then it should be running in async mode by default)
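
One generic way a library can pick between sync and async mode is to check for a running event loop, which is what a notebook with nest_asyncio applied would have. This detection sketch is an assumption about the general pattern, not Phoenix's actual logic:

```python
import asyncio


def pick_mode():
    # if an event loop is already running (e.g. in a notebook cell),
    # prefer async mode; otherwise fall back to plain sync execution
    try:
        asyncio.get_running_loop()
        return "async"
    except RuntimeError:
        return "sync"
```

At the top level of a plain script there is no running loop, so this returns "sync"; inside coroutine code it returns "async".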
nope. my script is very simple, basically just
experiment = run_experiment(
    dataset,
    my_task,
    evaluators=[eval_1, eval_2],
    experiment_metadata={"model": f"{_SELECT_MODEL}"},
    # dry_run=1,
)

with my_task being just a simple LLM call and a few custom evaluators
run_experiment should not be spawning processes unless your task is doing so
doesn't run_experiment process all the rows in parallel in some fashion?
or is it just processing every row in the dataset one by one?
it processes them in parallel if you're using nest_asyncio, whether in a notebook or in a standalone script
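
As an illustration of what parallel-but-bounded row processing looks like in one process, here is a sketch of the general pattern. The `run_rows` helper and the concurrency default of 3 are made up for the example, not Phoenix internals:

```python
import asyncio


async def run_rows(rows, task, concurrency=3):
    # fan out over dataset rows, but cap in-flight tasks with a semaphore
    sem = asyncio.Semaphore(concurrency)

    async def one(row):
        async with sem:
            return await task(row)

    # rows run concurrently as coroutines in a single process,
    # not as child processes (consistent with not seeing forks in ps ax)
    return await asyncio.gather(*(one(r) for r in rows))


async def double(row):
    await asyncio.sleep(0)  # stands in for an LLM call
    return row * 2


results = asyncio.run(run_rows([1, 2, 3], double))
```

`asyncio.gather` preserves input order, so results line up with the dataset rows even though they ran concurrently.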
