AI Risk Manager — Compare Workflow

Description: Evaluate multiple approaches or versions

When to Use

Use the compare workflow when you need to evaluate multiple approaches or versions side by side, for example competing mitigation strategies, successive model versions, or alternative vendors, scored against a common set of risk criteria.

Input Requirements

  • NIST AI RMF profiles and playbooks
  • AI system threat models and attack surface analyses
  • Risk register entries and historical incident data
  • Regulatory risk requirements (EU AI Act, sector-specific regulations)

Process

  1. Initialize — Gather the candidates to be compared and the risk criteria they will be scored against (NIST AI RMF profile, risk appetite, regulatory requirements)
  2. Execute — Score each candidate against the shared criteria, recording likelihood, impact, and mitigation status for each identified risk
  3. Validate — Check the comparison output against the quality gates
  4. Handoff — Deliver the ranked comparison and supporting risk register entries to downstream personas
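The steps above can be sketched in code. This is a minimal illustration, not a prescribed implementation: the names (`RiskScore`, `Candidate`, `compare`) and the 5×5 likelihood-by-impact scoring scheme are assumptions; real criteria and aggregation rules come from the organization's NIST AI RMF profile and risk appetite statement.

```python
from dataclasses import dataclass, field

@dataclass
class RiskScore:
    """One scored risk for a candidate (hypothetical 5x5 scheme)."""
    risk_id: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (negligible) .. 5 (severe)

    @property
    def rating(self) -> int:
        return self.likelihood * self.impact

@dataclass
class Candidate:
    """One approach or version under comparison."""
    name: str
    scores: list[RiskScore] = field(default_factory=list)

    def total_exposure(self) -> int:
        # Simple additive aggregation; a real portfolio may weight risks.
        return sum(s.rating for s in self.scores)

def compare(candidates: list[Candidate]) -> list[Candidate]:
    # Execute: rank candidates by total residual exposure,
    # lowest (least risky) first, ready for handoff downstream.
    return sorted(candidates, key=Candidate.total_exposure)
```

For example, comparing a vendor model scored (likelihood 4, impact 5) against an in-house model scored (2, 3) on the same risk would rank the in-house option first.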

Output

  • AI risk registers with likelihood, impact, and mitigation status
  • Risk heat maps showing portfolio-level AI risk exposure
  • NIST AI RMF function mapping reports (Govern, Map, Measure, Manage)
  • Continuous monitoring dashboards with risk trend analytics
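As a sketch of how register entries might roll up into a portfolio heat map, the snippet below buckets each (likelihood, impact) pair into a band and counts risks per band. The band thresholds are illustrative assumptions; actual bands must come from the governance authority's defined risk tolerance.

```python
from collections import Counter

def heat_bucket(likelihood: int, impact: int) -> str:
    """Map a 1-5 likelihood/impact pair to an illustrative band."""
    rating = likelihood * impact
    if rating >= 15:
        return "high"
    if rating >= 6:
        return "medium"
    return "low"

def heat_map(entries: list[tuple[int, int]]) -> Counter:
    # Portfolio-level exposure: count of risks per band.
    return Counter(heat_bucket(l, i) for l, i in entries)
```

A dashboard can then render the resulting counts as the heat map cells, with trend analytics tracking how the counts shift between assessment cycles.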

Quality Gates

  • Risk assessments must cover all four NIST AI RMF functions
  • High-risk AI systems require continuous monitoring, not just initial assessment
  • Risk appetite and tolerance levels must be defined by governance authority
  • Emerging risks must be captured within one assessment cycle of identification
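The first gate, full coverage of the four NIST AI RMF functions, lends itself to an automated check. A minimal sketch, assuming assessments tag each finding with the function it addresses (the function and set names here are illustrative):

```python
# The four NIST AI RMF functions named in the quality gate.
REQUIRED_FUNCTIONS = {"Govern", "Map", "Measure", "Manage"}

def gate_full_coverage(covered_functions: set[str]) -> bool:
    """Pass only when every NIST AI RMF function is addressed."""
    return REQUIRED_FUNCTIONS <= covered_functions
```

An assessment covering only Map and Measure would fail this gate and be returned for rework before handoff.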