Audit Readiness


The Audit Readiness page allows organizations to define mandatory approval gates that AI systems must pass before they can be approved for use. These gates standardize review processes across AI systems and enforce audit-aligned governance requirements.

This feature enables teams to operationalize internal AI approval workflows and align with governance standards such as the Australian AI Technical Standards (DTE) or other globally recognized frameworks.

Purpose

Audit gates are used to ensure that required documentation, controls, and reviews are in place before an AI system is approved. Each gate can include:

  • A descriptive name (e.g., "Security Review", "Privacy Assessment")

  • A list of required artifacts (such as completed risk assessments or TEVV documents)

  • A set of designated reviewers (roles or individuals who must sign off)

By defining gates upfront, organizations can ensure that all AI systems meet consistent readiness criteria before going live.

Key Features

  • Add Gate
    Click the + Add gate button to configure a new audit gate. Each gate includes:

    • Gate Name – A descriptive label for the checkpoint (e.g., "Fairness Review", "Technical Validation")

    • Required Artifacts – Documents or evidence that must be completed before passing the gate

    • Reviewers – Individuals or roles responsible for reviewing and approving the gate

  • Gate List
    Once added, gates appear in sequence, showing the required artifacts and reviewers for each. This allows all stakeholders to view the approval flow.

  • Gate Enforcement
    Gates form part of the organization’s internal AI approval process. An AI system cannot be marked as "Approved" until it has passed all defined gates.
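The gate model above (a named checkpoint with required artifacts and designated reviewers, enforced before approval) can be sketched in code. The class and field names below are illustrative only and do not reflect the product's actual data model or API:

```python
from dataclasses import dataclass, field

@dataclass
class AuditGate:
    """One approval checkpoint an AI system must pass (illustrative)."""
    name: str                      # e.g. "Security Review"
    required_artifacts: list[str]  # e.g. ["Risk assessment", "TEVV report"]
    reviewers: list[str]           # roles or individuals who must sign off

@dataclass
class AISystem:
    name: str
    passed_gates: set[str] = field(default_factory=set)

def can_approve(system: AISystem, gates: list[AuditGate]) -> bool:
    """A system may be approved only once every defined gate is passed."""
    return all(gate.name in system.passed_gates for gate in gates)

gates = [
    AuditGate("Security Review", ["Risk assessment"], ["Security lead"]),
    AuditGate("Privacy Assessment", ["Privacy impact analysis"], ["Privacy officer"]),
]

system = AISystem("Support chatbot")
print(can_approve(system, gates))   # False: no gates passed yet

system.passed_gates.update({"Security Review", "Privacy Assessment"})
print(can_approve(system, gates))   # True: all gates passed
```

The key property to notice is that approval is an all-or-nothing check: adding a new gate to the list automatically blocks approval for any system that has not passed it.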

Notes

  • Audit gates are designed to formalize governance checkpoints in the AI lifecycle.

  • Required artifacts should reflect your organization's AI assurance criteria, including risk mitigation, legal sign-off, and documentation of compliance.

  • This feature can support adoption of emerging standards, such as those defined by Australia’s DTE or similar national frameworks.

  • Gate reviewers may include cross-functional roles such as legal, security, privacy, or data science leads.

  • Once configured, these gates apply to AI system evaluations and support a clear, documented path to approval.
