👋 We just updated to Phoenix 10.7.0 and we're seeing the following error when opening the AI Providers tab. We deployed this on a fresh database instance to be fully clean (we were on 8.x previously). Is this a known issue?
Error fetching GraphQL query 'settingsAIProvidersPageLoaderQuery' with variables '{}': [{"message":"an unexpected error occurred","locations":[{"line":11,"column":5}],"path":["modelProviders",2,"credentialRequirements"]}]

In the logs we see:
KeyError: <GenerativeProviderKey.OLLAMA: 'Ollama'>

We tried adding an OpenAI key to the env to see if that did anything, but no luck.
This is a known issue in Phoenix 10.7.0 related to the AI Providers tab and the handling of the Ollama provider, resulting in a KeyError for GenerativeProviderKey.OLLAMA. The error occurs even on a fresh database and is not resolved by adding an OpenAI key as an environment variable. The problem is being tracked and discussed by the Phoenix team, and a fix is expected in an upcoming release. For now, there is no user-side workaround documented in the data sources; monitor the Phoenix GitHub for updates and patches.

Would you like more detail on the root cause or possible temporary mitigations?

Sources:
- https://github.com/Arize-ai/phoenix/issues/6258
- https://github.com/Arize-ai/phoenix/issues/5498
Hrm nope, not those issues
For context, we're running the docker image
No luck, unfortunately. Here's where the KeyError seems to be thrown:
File "/phoenix/env/phoenix/server/api/types/GenerativeProvider.py", line 109, in credential_requirements
return [self.model_provider_to_credential_requirements_map[self.key]]Does appear the map is missing ollama as a key 🤔. Assuming thats expected given Ollama uses the openai implementation https://github.com/Arize-ai/phoenix/blob/main/src/phoenix/server/api/types/GenerativeProvider.py#L60
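For anyone else digging into this, here's a minimal sketch of the failure mode. The map and method names come from the traceback above; the enum values and the `.get()` fallback are my own assumptions for illustration, not Phoenix's actual code:

```python
from enum import Enum


class GenerativeProviderKey(Enum):
    OPENAI = "OpenAI"
    OLLAMA = "Ollama"


# Map name copied from the traceback; contents here are illustrative.
model_provider_to_credential_requirements_map = {
    GenerativeProviderKey.OPENAI: ["OPENAI_API_KEY"],
    # GenerativeProviderKey.OLLAMA is absent, so indexing with it raises KeyError.
}


def credential_requirements(key: GenerativeProviderKey) -> list[str]:
    # Direct indexing reproduces the crash seen in the logs:
    #   model_provider_to_credential_requirements_map[key]  # KeyError for OLLAMA
    # A .get() with an empty default avoids it, on the assumption that a
    # locally-run Ollama needs no credentials:
    return model_provider_to_credential_requirements_map.get(key, [])


print(credential_requirements(GenerativeProviderKey.OLLAMA))  # -> []
```

Either adding an OLLAMA entry to the map or switching to a `.get()`-style lookup would presumably avoid the crash; the Phoenix source linked above is the authoritative version.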
My pleasure! Thanks for your love for Phoenix!
