Inference Orchestrator — Debug Workflow¶
Description: Diagnose and fix errors in Inference Orchestrator artifacts
When to Use¶
Use the debug workflow when an Inference Orchestrator artifact — a deployment pipeline configuration, inference endpoint specification, monitoring dashboard, or rollback procedure — contains an error that must be diagnosed and fixed.
Input Requirements¶
- Tuned model artifacts from Experiment Scientist
- Serving infrastructure specifications and capacity plans
- Latency SLO definitions and performance requirements
- Deployment pipeline configurations and rollback policies
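One of the inputs above, a latency SLO definition, can be captured as a small data structure. The following is a minimal sketch; the field names (`endpoint`, `p50_ms`, `p99_ms`, `error_budget_pct`) are illustrative assumptions, not a fixed schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LatencySLO:
    """One latency SLO definition for a single inference endpoint (sketch)."""
    endpoint: str            # endpoint name (hypothetical)
    p50_ms: float            # median latency budget in milliseconds
    p99_ms: float            # tail latency budget in milliseconds
    error_budget_pct: float  # share of requests allowed to breach the SLO

    def is_met(self, observed_p50_ms: float, observed_p99_ms: float) -> bool:
        """True when both observed percentiles are within budget."""
        return (observed_p50_ms <= self.p50_ms
                and observed_p99_ms <= self.p99_ms)

slo = LatencySLO(endpoint="embeddings-v2", p50_ms=40.0,
                 p99_ms=200.0, error_budget_pct=0.1)
```

A structure like this gives the Validate step a concrete object to check observed staging latencies against.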
Process¶
- Initialize — Gather the inputs above and set up the debug context for the Inference Orchestrator
- Execute — Diagnose the failure and apply the fix, following the Inference Orchestrator's conventions
- Validate — Check the corrected artifact against the quality gates
- Handoff — Deliver the results to downstream personas
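The Validate step can be sketched as a set of named predicates over the debug context, one per quality gate. The context keys here (`artifact_signed`, `health_ok`, and so on) are assumptions made for illustration:

```python
from typing import Callable, Dict, List

# Each gate is a predicate over the debug context (a plain dict here, by assumption).
Gate = Callable[[dict], bool]

GATES: Dict[str, Gate] = {
    "artifact_signed": lambda ctx: ctx.get("artifact_signed", False),
    "health_endpoint_up": lambda ctx: ctx.get("health_ok", False),
    "staging_slo_validated": lambda ctx: ctx.get("staging_slo_ok", False),
    "rollback_tested": lambda ctx: ctx.get("rollback_tested", False),
}

def validate(ctx: dict) -> List[str]:
    """Return the names of failing gates; an empty list means Handoff may proceed."""
    return [name for name, gate in GATES.items() if not gate(ctx)]
```

Keeping the gates in a table like this lets the workflow report every failing gate at once instead of stopping at the first one.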
Output¶
- Deployment pipeline configurations with canary strategies
- Inference endpoint specifications with health check definitions
- Latency monitoring dashboards and SLO compliance reports
- Rollback procedure documentation and test results
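The canary strategies in the deployment pipeline configurations can take many shapes; as one hedged sketch, a simple linear ramp-up of canary traffic might be generated like this (the step count and percentages are illustrative defaults, not a prescribed policy):

```python
from typing import Iterator, Tuple

def canary_schedule(steps: int = 4, final_pct: int = 100) -> Iterator[Tuple[int, int]]:
    """Yield (stage, canary_traffic_pct) pairs for a linear ramp-up to final_pct."""
    for stage in range(1, steps + 1):
        yield stage, final_pct * stage // steps
```

With the defaults, the schedule ramps 25% → 50% → 75% → 100%; the health and latency gates would be checked between stages before promoting further traffic.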
Quality Gates¶
- No deployment of unsigned or unvalidated model artifacts
- Health endpoints must be operational before traffic routing
- Latency SLOs must be validated in staging before production
- Rollback procedures must be tested and documented
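The last gate implies the rollback procedure needs a deterministic target to roll back to. A minimal sketch, assuming deployment history is a list of version records ordered oldest to newest with the failing deployment last:

```python
from typing import List, Optional

def rollback_target(history: List[dict]) -> Optional[str]:
    """Return the most recent healthy version before the current (failing) one.

    Returns None when no healthy prior version exists, in which case the
    rollback procedure cannot proceed automatically (assumption).
    """
    for entry in reversed(history[:-1]):  # skip the failing current deployment
        if entry.get("healthy"):
            return entry["version"]
    return None
```

Testing the rollback procedure, as the gate requires, then reduces to asserting that this lookup returns the expected version for known histories and that the no-target case is handled.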