Hi, can we have multiple projects for a single Phoenix instance? Right now I have projects/default. Can I have projects/dev1, projects/dev2, projects/dev3? If yes, how can I achieve that?
If you are talking about creating multiple projects, you can do that by setting the project name as a resource attribute in your OpenInference code:

resource = Resource(attributes={ResourceAttributes.PROJECT_NAME: project_name})
tracer_provider = trace_sdk.TracerProvider(resource=resource)

where project_name is your project name. If you then send some spans using the project name "dev", you will see two projects in the dashboard: default and dev. If you are referring to creating projects inside a project, I am not sure Phoenix supports that at the moment, but you can still approximate that behavior by using a prefix in your project names. Ref: https://docs.arize.com/phoenix/deployment/deploying-phoenix#initialize-instrumentation
Thanks Nabeegh! Alternatively, we provide a using_project context manager that is convenient when you are working in a notebook environment and want different cells to send traces to different projects.
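The idea behind that context manager is to temporarily override which project new spans are attributed to (the real one lives in the phoenix package; check its docs for the exact import path). A dependency-free sketch of the pattern using contextvars:

```python
from contextlib import contextmanager
from contextvars import ContextVar

# Illustrative only: mimics how a using_project-style context manager can
# route spans, by swapping the active project name for the enclosed code.
_current_project: ContextVar[str] = ContextVar("project", default="default")

@contextmanager
def using_project(name: str):
    token = _current_project.set(name)
    try:
        yield
    finally:
        _current_project.reset(token)  # restore the previous project

with using_project("dev1"):
    assert _current_project.get() == "dev1"  # spans here go to "dev1"
assert _current_project.get() == "default"   # back to the default project
```

In a notebook, each cell would wrap its workload in its own `with using_project(...)` block to send traces to a different project.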
Yes, it worked for me. Thanks a lot for the help!
One more thing: I am trying to run some queries, and below is a screenshot of my invocation params, where streaming is not set to false. Is this the reason I am not getting the token count?
a lot of providers will not supply the token count when streaming
How can I get the token count then?
if you're using streaming and the provider doesn't supply that information, then at the moment you either have to estimate it using a token-counting tool or see if you can use the model without streaming
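As a stopgap, you can estimate token counts from the streamed text yourself. A model-specific tokenizer (e.g. tiktoken for OpenAI models) is more accurate, but here is a crude, dependency-free heuristic; the 4-characters-per-token ratio is a rough rule of thumb for English, not an exact figure:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token for English text.

    This is a heuristic, not a real tokenizer; use the provider's
    tokenizer (e.g. tiktoken) when you need accurate counts.
    """
    if not text:
        return 0
    return max(1, round(len(text) / 4))

# Accumulate the streamed chunks, then estimate the completion tokens.
chunks = ["The quick brown fox ", "jumps over the lazy dog."]
completion = "".join(chunks)
print(estimate_tokens(completion))  # prints 11
```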
it's on our roadmap to provide utilities to track tokens even if the model provider does not supply that info
Does LlamaIndex support token counts?
it depends; I believe they support it for some models
I am using the gpt-35-turbo model
I don't stay up to date on which models LlamaIndex's model wrappers support token counts for, since this depends on both OpenAI and LlamaIndex
if llamaindex provides a token count in their callback, we will display it
