Roger Y. I'm honestly still figuring it out, but we are taking a modular approach to prompts (so each prompt is composed of multiple versioned sub-prompts), and in addition to pure instructional prompts we have context prompts, e.g. a description of our core data model to help with using the query tools. That data model description then gets embedded into multiple system prompts to provide context for different operations in the agent graph.
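To make the modular idea concrete, here's a minimal sketch of what I mean, with plain dicts standing in for a prompt store; all names here are made up for illustration, not any real API:

```python
# Hypothetical sketch: versioned sub-prompts (keyed by name + version)
# composed into full system prompts. A shared "data model" context
# sub-prompt gets reused across different agent operations.

SUB_PROMPTS = {
    ("data_model", "v2"): "Core data model: users, orders, and events tables.",
    ("query_rules", "v1"): "When querying, always filter by tenant_id.",
}

def compose(parts: list[tuple[str, str]]) -> str:
    """Join named, versioned sub-prompts into one system prompt string."""
    return "\n\n".join(SUB_PROMPTS[(name, version)] for name, version in parts)

# Same data-model context, embedded into two different system prompts:
query_prompt = compose([("data_model", "v2"), ("query_rules", "v1")])
summarize_prompt = compose([("data_model", "v2")])
```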
Since I also need to do post-hoc composition, it would in some sense be easier for me if I could just version pure strings/templates; right now I have to go string -> system message -> upload -> Phoenix store -> download -> versioned prompt -> OpenAI message -> text -> compose.
To make better use of the message storage format, I could version the composed prompts separately, in which case I could plug them directly into LangGraph (by converting to its format), but then there is some overhead in tracking how each composed prompt is pieced together from particular sub-prompt versions.
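The tracking overhead I have in mind is roughly this: alongside the composed text you have to carry the provenance of which sub-prompt versions went into it. A hypothetical sketch (all names illustrative, not any real library):

```python
# Hypothetical: version the composed prompt while recording which
# sub-prompt versions it was pieced together from.

from dataclasses import dataclass, field

@dataclass
class ComposedPrompt:
    text: str
    # sub-prompt name -> version, so the composition is reproducible
    sources: dict[str, str] = field(default_factory=dict)

def compose_tracked(sub_prompts: dict[str, tuple[str, str]]) -> ComposedPrompt:
    """sub_prompts maps name -> (version, text)."""
    text = "\n\n".join(t for _, t in sub_prompts.values())
    sources = {name: v for name, (v, _) in sub_prompts.items()}
    return ComposedPrompt(text=text, sources=sources)

prompt = compose_tracked({
    "data_model": ("v2", "Core data model description."),
    "task": ("v1", "Answer questions using the query tools."),
})
```

The composed `prompt.text` is what would get stored and handed to LangGraph, while `prompt.sources` is the extra bookkeeping.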
I don't know if modular prompts are on the roadmap in some form or completely out of scope 🙂