Hi 🔒[private user] and team, I moved the question here since it's an AX question:
Hey, I'm trying Ollama and it doesn't seem to work with structured outputs. Any idea why?
Ollama should be compatible with that too, right? I set up Ollama via the custom model endpoints, which follow the OpenAI API standard, no? 🔒[private user]?
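For context, this is roughly the shape of request I'm sending. A minimal sketch, assuming Ollama's OpenAI-compatible API at its default `http://localhost:11434/v1` base URL and assuming its compat layer accepts the standard `response_format` JSON-schema field (the model name is just a placeholder); this only builds the request body without sending it:

```python
import json

# Hypothetical JSON schema for the structured output I want back.
schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name", "age"],
}

# Request body in the OpenAI chat-completions shape. Whether Ollama's
# compat endpoint honors the `json_schema` variant of `response_format`
# is exactly what I'm unsure about.
payload = {
    "model": "llama3.1",  # placeholder local model name
    "messages": [
        {"role": "user", "content": "Extract: Alice is 30 years old."}
    ],
    "response_format": {
        "type": "json_schema",
        "json_schema": {"name": "person", "schema": schema},
    },
}

# Serialized body that would be POSTed to /v1/chat/completions.
body = json.dumps(payload)
```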
Also, any idea if/when I'll be able to send an array of items, or use logic such as Jinja, in prompts/experiments?
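To illustrate what I mean, here's a toy example of the kind of templating I'd like in prompts, sketched with the standalone `jinja2` library (the template and items are made up, just to show looping over an array):

```python
from jinja2 import Template

# Hypothetical prompt template: loop over an array of items with Jinja.
template = Template(
    "Summarize the following items:\n"
    "{% for item in items %}- {{ item }}\n{% endfor %}"
)

# Render the prompt with an example array.
prompt = template.render(items=["apples", "bananas", "cherries"])
```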