
Quality Guardian — Compare Workflow

Description: Evaluate multiple approaches or versions

When to Use

Use the compare workflow when you need to evaluate multiple approaches or versions against the same data quality criteria, for example two candidate validation strategies, or the current and a proposed version of a dataset or pipeline.

Input Requirements

  • Data quality requirements and threshold definitions (sketched after this list)
  • Schema specifications and referential integrity rules
  • Historical data profiles and statistical baselines
  • Pipeline execution logs and freshness metadata
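
As a rough illustration of the first input, threshold definitions might be captured in a small typed structure like the Python sketch below. The class name, field names, and example values are assumptions for illustration, not part of the workflow specification.

  from dataclasses import dataclass

  @dataclass(frozen=True)
  class QualityThresholds:
      # Minimum fraction of non-null values required in each critical column.
      min_completeness: float = 0.99
      # Maximum tolerated source-to-target row loss, as a fraction of source rows.
      max_row_loss: float = 0.0
      # Maximum allowed age of the newest record, in hours (freshness SLA).
      max_staleness_hours: int = 24
      # Relative drift above which a column statistic is flagged for review.
      max_drift: float = 0.2

  # Hypothetical thresholds for a production orders dataset.
  orders_thresholds = QualityThresholds(min_completeness=0.995, max_staleness_hours=6)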

Process

  1. Initialize — Set up the compare context for Quality Guardian, loading the quality requirements, thresholds, and baselines listed above
  2. Execute — Perform the compare operation on each approach or version, following Quality Guardian's conventions (see the sketch after this list)
  3. Validate — Check the output against the quality gates defined below
  4. Handoff — Deliver the results to downstream personas
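
A compressed sketch of how the four steps might fit together when the things being compared are two versions of a dataset. The profiling and comparison functions, column names, and toy data are assumptions for illustration only.

  def profile(rows: list[dict], column: str) -> dict:
      """Compute a tiny statistical profile for one column (completeness and mean)."""
      values = [r.get(column) for r in rows]
      non_null = [v for v in values if v is not None]
      completeness = len(non_null) / len(values) if values else 0.0
      mean = sum(non_null) / len(non_null) if non_null else None
      return {"completeness": completeness, "mean": mean}

  def compare_profiles(baseline: dict, candidate: dict, max_drift: float) -> dict:
      """Compare a candidate profile against a historical baseline."""
      base_mean = baseline["mean"] or 0.0
      cand_mean = candidate["mean"] or 0.0
      drift = abs(cand_mean - base_mean) / (abs(base_mean) or 1.0)
      return {
          "completeness_delta": candidate["completeness"] - baseline["completeness"],
          "drift": drift,
          "drift_flagged": drift > max_drift,
      }

  # Initialize: a historical baseline and a candidate version (toy data).
  baseline_rows  = [{"amount": 10.0}, {"amount": 12.0}, {"amount": 11.0}]
  candidate_rows = [{"amount": 10.5}, {"amount": None}, {"amount": 30.0}]

  # Execute and Validate: compare the profiles, then check against a drift threshold.
  report = compare_profiles(profile(baseline_rows, "amount"),
                            profile(candidate_rows, "amount"),
                            max_drift=0.2)

  # Handoff: this report is what downstream personas would receive.
  print(report)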

Output

  • Quality gate configurations with threshold definitions
  • Validation framework code for completeness and integrity checks (an integrity-check sketch follows this list)
  • Statistical drift detection reports with anomaly flags
  • Freshness monitoring dashboards and alerting rules
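
The validation framework code mentioned above could include checks like the referential-integrity sketch below; the table names, keys, and report shape are assumptions. Returning a structured report rather than a bare boolean keeps the evidence auditable, which the quality gates below require.

  def check_referential_integrity(child_rows: list[dict], parent_ids: set, fk: str) -> dict:
      """Flag child rows whose foreign key has no matching parent (orphans)."""
      orphans = [r for r in child_rows if r.get(fk) not in parent_ids]
      return {
          "check": "referential_integrity",
          "passed": not orphans,
          "orphan_count": len(orphans),
          "evidence": orphans[:10],  # sample retained as auditable evidence
      }

  # Hypothetical example: order_items must reference an existing order_id.
  orders      = {"o-1", "o-2"}
  order_items = [{"order_id": "o-1"}, {"order_id": "o-3"}]
  print(check_referential_integrity(order_items, orders, fk="order_id"))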

Quality Gates

  • No untracked schema drift in production datasets
  • No silent row loss during transformation or movement operations (see the sketch after this list)
  • No bypass of critical quality gates without a documented exception
  • All quality checks must produce auditable evidence
  • Freshness SLAs must be validated for all production datasets
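
Two of the gates above, silent row loss and freshness SLAs, reduce to simple numeric checks. The sketch below assumes that source and target row counts and the newest record's timestamp are available at validation time; the function and field names are illustrative.

  from datetime import datetime, timedelta, timezone

  def reconcile_row_counts(source_rows: int, target_rows: int, max_loss_fraction: float = 0.0) -> dict:
      """Fail the gate if the target lost more rows than the configured tolerance."""
      lost = source_rows - target_rows
      loss_fraction = lost / source_rows if source_rows else 0.0
      return {"gate": "no_silent_row_loss", "passed": loss_fraction <= max_loss_fraction,
              "rows_lost": lost, "loss_fraction": round(loss_fraction, 6)}

  def check_freshness(latest_record_at: datetime, sla_hours: int) -> dict:
      """Fail the gate if the newest record is older than the freshness SLA."""
      age = datetime.now(timezone.utc) - latest_record_at
      return {"gate": "freshness_sla", "passed": age <= timedelta(hours=sla_hours),
              "age_hours": round(age.total_seconds() / 3600, 2)}

  print(reconcile_row_counts(source_rows=1_000_000, target_rows=999_998))
  print(check_freshness(datetime.now(timezone.utc) - timedelta(hours=3), sla_hours=6))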