Hello team, I have a question: can Phoenix Prompt Management be used as a prompt library, with versioning and maintenance? How is it backed?
Yes, Phoenix Prompt Management is designed to serve as a prompt library with full versioning and maintenance capabilities. You can create, store, modify, and tag prompts, track changes over time, and collaborate with others. Each prompt has a linear version history, supports tagging (e.g., "production", "staging"), and can be cloned or edited from any version. All prompt data, including versions and tags, is stored and managed within the Phoenix backend, accessible via the UI or SDKs, and is backed by the Phoenix server's persistent storage system for reliability and auditability.

Prompt management is accessible via the Phoenix UI and both the Python and TypeScript SDKs, allowing you to pull prompts by name, version, or tag and use them programmatically in your applications. The system is designed for high availability, with planned support for caching and fallback mechanisms to stay reliable even during network issues.

For more, see Prompt Management, How to: Prompts, and Using a prompt. Would you like more detail on how to use the SDKs or how versioning works internally?

Sources:
- Prompt Management
- How to: Prompts
- Using a prompt
Where are these prompts stored? In the Postgres DB? And what is the recommendation for updating prompts: from the UI or from code?
Thanks Dylan. How are these records stored in Postgres? Is it a blob, or what is the datatype? And can we control how they are stored?
Sorry, it's the self-hosted option.
thank you, that answers my question
🔒[private user] - We are not able to update the prompts. It's a self-hosted solution with Azure Postgres as the DB. Old prompts are getting updated but new ones are not. Strange. What are we missing here? I saw a GitHub issue that increased the limit to 200 (https://github.com/Arize-ai/phoenix/issues/7648). How can it be increased, as we are unable to update the prompts?
