
Reproducibility Sentinel — Compare Workflow

Description: Evaluate multiple approaches or versions of a computational workflow side by side

When to Use

Use the compare workflow when you need to evaluate multiple approaches or versions of an analysis — for example, re-executing the same pipeline under different pinned environments and checking whether the results still match the published claims.

Input Requirements

  • Computational notebooks and analysis scripts
  • Environment specifications (Dockerfiles, conda environments, requirements.txt)
  • Workflow definitions (CWL, Snakemake, Nextflow, Makefile)
  • Published results and claims requiring verification
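Before comparing anything, the environment inputs themselves can be screened. As a minimal sketch (the helper name and the exact-pin policy are assumptions, not part of this workflow's tooling), a requirements.txt can be scanned for entries that are not pinned to an exact version:

```python
import re

def unpinned_requirements(lines):
    """Return requirement lines that are not pinned to an exact version."""
    unpinned = []
    for line in lines:
        line = line.split("#")[0].strip()  # drop comments and whitespace
        if not line:
            continue
        if not re.search(r"==\S+", line):  # exact pin looks like "numpy==1.26.4"
            unpinned.append(line)
    return unpinned

print(unpinned_requirements(["numpy==1.26.4", "pandas>=2.0", "scipy"]))
# ['pandas>=2.0', 'scipy']
```

Range specifiers such as `>=` pass `pip` but fail the reproducibility gate here, since two re-executions may silently resolve different versions.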

Process

  1. Initialize — Set up the compare context for Reproducibility Sentinel: collect the notebooks, scripts, and environment specifications for each candidate
  2. Execute — Re-run each candidate workflow in its specified environment, capturing logs and output artifacts
  3. Validate — Check the re-executed outputs against the quality gates below
  4. Handoff — Deliver the audit report and comparison matrices to downstream personas
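The Validate step above can be sketched as a per-claim pass/fail check of re-executed values against published ones. The function name and the relative-tolerance policy are illustrative assumptions, not the workflow's mandated implementation:

```python
import math

def compare_claims(published, reproduced, rel_tol=1e-6):
    """Mark each published metric pass/fail against its re-executed value."""
    report = {}
    for name, claimed in published.items():
        got = reproduced.get(name)  # None if the re-execution never produced it
        ok = got is not None and math.isclose(claimed, got, rel_tol=rel_tol)
        report[name] = "pass" if ok else "fail"
    return report

print(compare_claims({"accuracy": 0.912, "auc": 0.950},
                     {"accuracy": 0.912, "auc": 0.941}))
# {'accuracy': 'pass', 'auc': 'fail'}
```

A metric missing from the re-execution is reported as a failure rather than skipped, so every published claim gets an explicit verdict.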

Output

  • Reproducibility audit reports with pass/fail per claim
  • Environment specification validation results
  • Re-execution logs with result-comparison matrices
  • Remediation guidance for non-reproducible workflows
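A result-comparison matrix, as named in the outputs above, could take the shape of a metric-by-run table. This is a sketch under assumed inputs (per-run result dicts keyed by metric name); the real audit report may carry more metadata:

```python
def comparison_matrix(runs):
    """Build a metric x run matrix from per-run result dicts.

    `runs` maps a run label (e.g. "published", "rerun-1") to its metrics;
    missing metrics appear as None so gaps are visible in the matrix.
    """
    metrics = sorted({m for results in runs.values() for m in results})
    return {m: {run: results.get(m) for run, results in runs.items()}
            for m in metrics}

matrix = comparison_matrix({
    "published": {"accuracy": 0.912},
    "rerun-1":   {"accuracy": 0.912},
    "rerun-2":   {"accuracy": 0.907},
})
print(matrix["accuracy"]["rerun-2"])
# 0.907
```

Rows where the reruns disagree with the published column are the entries that feed the remediation guidance.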

Quality Gates

  • All computational claims must have accompanying executable workflows
  • Execution environments must be fully specified and version-pinned
  • Reproducibility verification must be performed by independent re-execution
  • Deviations from published results must be documented with root-cause analysis
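The "fully specified and version-pinned" gate can also be applied to container images. As one hedged example of such a check (the digest-pinning policy is an assumption; the gate does not prescribe a mechanism), a Dockerfile's `FROM` lines can be required to pin their base image by sha256 digest rather than a mutable tag:

```python
def dockerfile_pinned(lines):
    """True if every FROM line pins its base image by sha256 digest."""
    froms = [l for l in lines if l.strip().upper().startswith("FROM")]
    # A tag like "python:3.12" can move; "python@sha256:..." cannot.
    return bool(froms) and all("@sha256:" in l for l in froms)

print(dockerfile_pinned(["FROM python:3.12",
                         "RUN pip install -r requirements.txt"]))
# False
```

Tag-pinned images fail this gate because the tag can be repointed at a different image between the original run and the independent re-execution.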