This starter template is a structured, copy/paste-ready “container” for converting workshop answers into a usable requirements package that teams can actually build and test against. It provides clear sections for the most important AI requirements categories:

- use case and AI role
- scope boundaries
- users and workflow context
- measurable success metrics
- inputs and data sources, with authoritative “source of truth” rules
- access and privacy constraints (least privilege, retention, logging, and training usage)
- output contract (format, required fields, tone/style)
- guardrails (refusals, disallowed content/actions, escalation triggers)
- fallback behavior for low confidence
- quality and non-functional requirements (metrics, thresholds, latency)
- testing and acceptance (golden dataset and sign-offs)
- monitoring and operations (signals, alerts, incident triggers)
- change control (versioning, approvals, rollback, audit trail)
The template is intentionally lightweight: it helps teams document what they know, flag what they don’t know as explicit TBDs, and generate discovery tasks rather than relying on assumptions. BAs can use it as a PRD section, a Confluence page, an epic attachment, or a governance artifact. It is especially valuable in regulated environments, where traceability, accountability, and auditability matter.
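The “flag TBDs, generate discovery tasks” workflow can be sketched in code. This is a minimal illustration, not part of the template itself: the section keys mirror the categories above, and the sample answers (a hypothetical billing-support assistant) are invented purely for the example.

```python
# Minimal sketch: capture template sections, mark unknowns as explicit TBDs,
# and turn every TBD into a discovery task instead of a silent assumption.
# Section keys and sample answers are illustrative, not prescribed.

TBD = "TBD"  # sentinel for a section the team has not answered yet

requirements = {
    "use_case_and_ai_role": "Draft first-pass replies to billing questions",
    "scope_boundaries": TBD,
    "users_and_workflow": "Agents review every draft before sending",
    "success_metrics": TBD,
    "inputs_and_sources_of_truth": "Billing DB is authoritative over the wiki",
    "access_and_privacy": TBD,
    "output_contract": "Structured reply with citations and confidence",
    "guardrails": "Refuse legal advice; escalate disputed charges",
    "fallback_behavior": TBD,
    "quality_and_nfrs": TBD,
    "testing_and_acceptance": TBD,
    "monitoring_and_operations": TBD,
    "change_control": TBD,
}

def discovery_tasks(reqs: dict[str, str]) -> list[str]:
    """Return one explicit discovery task per TBD section."""
    return [
        f"Discovery: define '{section}'"
        for section, answer in reqs.items()
        if answer == TBD
    ]

for task in discovery_tasks(requirements):
    print(task)
```

Keeping unknowns as a first-class sentinel (rather than omitting the key) is the point: the gap stays visible in the document and on the backlog until someone closes it.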