Forecasting Analyst — Compare Workflow

Description: Evaluate multiple forecasting approaches or model versions against a shared backtest

When to Use

Use the compare workflow when you need to evaluate multiple forecasting approaches or model versions on the same historical data, so that backtesting metrics and quality-gate results are directly comparable.

Input Requirements

  • Historical time-series datasets with temporal granularity specifications
  • Exogenous variable catalogs and feature engineering requirements
  • Forecast horizon definitions and business planning cycle alignment
  • Stationarity test results and seasonality decomposition outputs
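The inputs above can be gathered into a single specification object before the comparison runs. A minimal sketch follows; all field names (`input_spec`, `granularity`, `planning_cycle`, etc.) are illustrative assumptions, not a required schema.

```python
# Hypothetical input specification for a compare run.
# Field names are illustrative, not a mandated schema.
input_spec = {
    "series": {
        "path": "sales.csv",        # historical time-series dataset
        "granularity": "weekly",    # temporal granularity specification
    },
    "exogenous": ["promo_flag", "holiday_flag"],  # exogenous variable catalog
    "horizon": 13,                  # forecast horizon in periods
    "planning_cycle": "quarterly",  # business planning cycle alignment
    "preprocessing": {
        "stationarity_test": "ADF",   # e.g., augmented Dickey-Fuller result attached here
        "decomposition": "additive",  # seasonality decomposition choice
    },
}
```

Collecting the requirements in one place makes it easy to verify that every candidate model is compared under identical inputs.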

Process

  1. Initialize — Load the historical series, exogenous variables, and horizon definitions listed under Input Requirements
  2. Execute — Fit each candidate model or version and backtest all of them under the same validation scheme
  3. Validate — Check every candidate's output against the quality gates below
  4. Handoff — Deliver the comparison results to downstream personas
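The Execute step's shared backtest can be sketched with an expanding-window walk-forward loop. This is a minimal stdlib-only illustration; `naive_forecast` and `walk_forward` are hypothetical names, and a real run would substitute the candidate models being compared.

```python
def naive_forecast(history):
    """Baseline candidate: predict the last observed value (illustrative only)."""
    return history[-1]

def walk_forward(series, model, min_train=3):
    """Walk-forward validation with an expanding window: at each step,
    train on all data up to time t and forecast one step ahead."""
    preds, actuals = [], []
    for t in range(min_train, len(series)):
        history = series[:t]          # expanding training window
        preds.append(model(history))  # one-step-ahead forecast
        actuals.append(series[t])     # held-out actual
    return preds, actuals

series = [10, 12, 11, 13, 14, 13, 15]
preds, actuals = walk_forward(series, naive_forecast)
# preds   -> [11, 13, 14, 13]
# actuals -> [13, 14, 13, 15]
```

Running every candidate through the same `walk_forward` loop on the same series is what makes the resulting error metrics comparable.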

Output

  • Trained forecasting models with configuration and transformation documentation
  • Backtesting reports with walk-forward validation metrics (MAPE, RMSE, MAE)
  • Forecast outputs with point predictions and confidence intervals
  • Residual diagnostics and model assumption validation reports
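The backtesting metrics named above (MAPE, RMSE, MAE) and a simple confidence interval can be computed with stdlib Python alone. A sketch, assuming equal-length actual/prediction lists; `prediction_interval` uses a normal approximation on the residual standard deviation, which is one common choice, not the only one.

```python
import math

def mae(actual, pred):
    """Mean absolute error."""
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def rmse(actual, pred):
    """Root mean squared error."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual))

def mape(actual, pred):
    """Mean absolute percentage error. Assumes no zero actuals; undefined otherwise."""
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, pred)) / len(actual)

def prediction_interval(point, residual_std, z=1.96):
    """Approximate 95% interval around a point forecast,
    assuming roughly normal residuals."""
    return (point - z * residual_std, point + z * residual_std)

actual = [100, 110, 120]
pred = [90, 115, 130]
```

Reporting all three metrics side by side is useful because MAPE is scale-free while RMSE penalizes large misses more heavily than MAE.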

Quality Gates

  • Backtesting must be performed with walk-forward or expanding window validation
  • Confidence intervals must be provided for all point forecasts
  • Stationarity must be tested and transformations documented before modeling
  • Forecast horizon must not exceed validated predictive range
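The four gates above lend themselves to an automated check before handoff. A sketch follows; the `report` dict and its field names (`validation_scheme`, `validated_horizon`, etc.) are assumptions chosen for illustration, not a defined interface.

```python
def check_quality_gates(report):
    """Return a list of quality-gate failures for a comparison report.
    An empty list means all gates pass. Field names are illustrative."""
    failures = []
    if report.get("validation_scheme") not in {"walk_forward", "expanding_window"}:
        failures.append("backtesting must use walk-forward or expanding-window validation")
    if not all("interval" in f for f in report.get("forecasts", [])):
        failures.append("every point forecast needs a confidence interval")
    if not (report.get("stationarity_tested") and report.get("transformations_documented")):
        failures.append("stationarity must be tested and transformations documented")
    if report.get("horizon", 0) > report.get("validated_horizon", 0):
        failures.append("forecast horizon exceeds validated predictive range")
    return failures

report = {
    "validation_scheme": "walk_forward",
    "forecasts": [{"point": 14.2, "interval": (12.0, 16.4)}],
    "stationarity_tested": True,
    "transformations_documented": True,
    "horizon": 13,
    "validated_horizon": 13,
}
```

Gating the handoff on an empty failure list keeps subjective judgment out of the Validate step.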