Hi Arize team 👋
We are running into an issue with an Amazon Bedrock-hosted model in Arize.
Issue details:
- Model: Llama 3.3 70B Instruct (via Bedrock)
- Location: Prompt Hub → Playground
- Error shown: “Unable to generate new response for one or more results”
- Same prompt works fine with: Claude 3.5 Haiku
- Input is very simple (example: “Generate a haiku about {topic}”)
What we’ve observed:
- The playground loads correctly
- Variables are passed correctly
- Execution fails only for Llama 3.3 70B
- Claude models work without any issue
Questions:
1. Is Llama 3.3 70B fully supported via Bedrock in Prompt Hub?
2. Are there known limitations (timeout, token limits, region restrictions)?
3. Any required parameter changes (max tokens, temperature, stop sequences)?
4. Any logs or debug view we should check inside Arize?
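For context, this is roughly the direct Bedrock Converse API call we would expect to succeed with the same prompt (a sketch only: the model ID, region, and inference settings below are our assumptions, not values taken from Arize):

```python
def build_request(topic: str) -> dict:
    """Build a Bedrock Converse API payload for the playground prompt."""
    return {
        # Assumed model ID; some regions require the "us." inference-profile prefix.
        "modelId": "meta.llama3-3-70b-instruct-v1:0",
        "messages": [
            {"role": "user", "content": [{"text": f"Generate a haiku about {topic}"}]},
        ],
        # Conservative defaults; the playground's actual settings may differ.
        "inferenceConfig": {"maxTokens": 256, "temperature": 0.7},
    }


if __name__ == "__main__":
    import boto3  # imported here so the payload builder stays dependency-free

    client = boto3.client("bedrock-runtime", region_name="us-east-1")  # region is an assumption
    response = client.converse(**build_request("autumn"))
    print(response["output"]["message"]["content"][0]["text"])
```

If this direct call works but the Playground call fails, that would point at the Arize-side integration rather than Bedrock itself.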
Screenshot attached for reference.
Thanks in advance for your help!