Hypothesis Explorer — Compare Workflow

Description: Evaluate multiple approaches or versions

When to Use

Use the compare workflow when you need to evaluate multiple approaches or versions of the same work product side by side, for example competing experiment designs or successive revisions of an analysis.

Input Requirements

  • Persona registry data with dimension profiles and cross-references
  • Previous analysis results and established baselines
  • Hypothesis templates and research question frameworks
  • Statistical significance thresholds and methodology standards

Process

  1. Initialize — Set up the compare context for Hypothesis Explorer
  2. Execute — Perform the compare operation following Hypothesis Explorer's style
  3. Validate — Check output against quality gates
  4. Handoff — Deliver results to downstream personas

Output

  • Formal hypothesis definitions with testable predictions
  • Experiment design documents with methodology specifications
  • Significance reports with effect sizes and confidence intervals
  • Vocabulary overlap analysis between persona pairs
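The vocabulary overlap output could be computed with a standard set-similarity measure such as Jaccard similarity; the sketch below assumes each persona's vocabulary is available as a collection of terms (the function names are illustrative, not part of any documented API).

```python
from itertools import combinations


def jaccard_overlap(vocab_a, vocab_b):
    """Jaccard similarity between two term sets: |A ∩ B| / |A ∪ B|."""
    a, b = set(vocab_a), set(vocab_b)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)


def pairwise_overlap(persona_vocabs):
    """Overlap score for every unordered pair of personas.

    `persona_vocabs` maps persona name -> iterable of vocabulary terms.
    """
    return {
        (p, q): jaccard_overlap(persona_vocabs[p], persona_vocabs[q])
        for p, q in combinations(sorted(persona_vocabs), 2)
    }
```

With vocabularies `{"x", "y"}` and `{"y", "z"}`, the pair shares one of three distinct terms, giving an overlap of 1/3.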

Quality Gates

  • Hypotheses must be pre-registered before data exploration
  • Multiple comparison corrections required for simultaneous tests
  • Effect sizes must accompany all significance tests
  • Null results must be reported with the same rigor as positive findings
  • Analysis code must be version-controlled and reproducible
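Two of the gates above have standard implementations: multiple-comparison correction (here Holm's step-down variant of Bonferroni, one common choice among several) and an effect size to accompany each test (here Cohen's d). This is a self-contained sketch, not the persona's mandated methodology.

```python
import math


def cohens_d(sample_a, sample_b):
    """Effect size: mean difference scaled by the pooled standard deviation."""
    n1, n2 = len(sample_a), len(sample_b)
    m1, m2 = sum(sample_a) / n1, sum(sample_b) / n2
    v1 = sum((x - m1) ** 2 for x in sample_a) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in sample_b) / (n2 - 1)
    pooled = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled


def holm_bonferroni(p_values, alpha=0.05):
    """Reject/fail decision per hypothesis under Holm's step-down correction.

    Sort p-values ascending; the k-th smallest is compared against
    alpha / (m - k). Once one comparison fails, all larger p-values fail.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        if p_values[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break  # step-down: stop at the first failure
    return reject
```

For p-values `[0.01, 0.04, 0.03]` at alpha = 0.05, only the smallest (0.01 ≤ 0.05/3) survives the correction; 0.03 fails its threshold of 0.025, so 0.04 is never tested.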