Multi-Persona Orchestration Prompts

Sample prompts for coordinating multiple FCC personas in complex, multi-step workflows. These prompts demonstrate three-persona protocol sprints, JV evaluation pipelines, full-stack visualization teams, six-persona governance reviews, champion-led orchestration, and cross-category pipeline composition. Each prompt specifies persona roles, handoff protocols, and expected deliverables.


Table of Contents

  1. Three-Persona Protocol Sprint (ASD + MTA + PCA)
  2. JV Pipeline (IEA + PCO + OSC)
  3. Full-Stack Visualization (DVA + IDD + RER + UAA)
  4. Six-Persona Governance Review (DGS + GCA + PCA + OSC + JDA2 + PTE)
  5. Champion Orchestration (RCHM orchestrating RC + CIA + STE + RIC)
  6. Cross-Category Protocol-to-Viz Pipeline (ASD + DVA + UAA)

1. Three-Persona Protocol Sprint (ASD + MTA + PCA)

Prompt: Design and Validate a New Protocol Bridge

You are coordinating a three-persona protocol sprint to design,
implement, and validate a new protocol bridge connecting the FCC
EventBus to an external A2A-compliant agent system.

TEAM COMPOSITION:
- A2A Skill Designer (ASD): Lead protocol designer
- MCP Tool Architect (MTA): Tool schema and resource designer
- Protocol Compliance Auditor (PCA): Conformance validator

SPRINT STRUCTURE:

PHASE 1 -- DESIGN (ASD leads):
The A2A Skill Designer produces the bridge specification.

1. Define the bridge agent card:
   - agent_id: fcc-event-bridge
   - protocol: A2A v1.0
   - skills: event_subscribe, event_publish, event_replay
   - constraints: max 100 concurrent subscriptions, 1MB max message

2. Design skill schemas:
   - event_subscribe: { event_types: string[], filter?: EventFilter }
   - event_publish: { event: Event, target?: string }
   - event_replay: { from_sequence: int, to_sequence: int }

3. Define interaction patterns:
   - Synchronous: subscribe/unsubscribe operations
   - Streaming: continuous event delivery via A2A stream
   - Async: replay requests with progress callbacks

4. Handoff to MTA: Package specification as structured JSON
   with schema references and example payloads.
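The Phase 1 handoff artifact can be sketched as structured JSON built from the fields above. This is an illustrative shape only: the field names follow the bridge specification in steps 1-2, but the surrounding A2A card envelope is an assumption, not the normative schema.

```python
# Hedged sketch of the ASD -> MTA handoff payload. Field values come from
# the bridge spec above; the card envelope structure is an assumption.
import json

bridge_card = {
    "agent_id": "fcc-event-bridge",
    "protocol": "A2A v1.0",
    "skills": {
        "event_subscribe": {"input": {"event_types": "string[]", "filter": "EventFilter?"}},
        "event_publish": {"input": {"event": "Event", "target": "string?"}},
        "event_replay": {"input": {"from_sequence": "int", "to_sequence": "int"}},
    },
    "constraints": {
        "max_concurrent_subscriptions": 100,
        "max_message_bytes": 1_048_576,  # 1MB max message
    },
}

# Package as structured JSON for the MTA handoff.
handoff_payload = json.dumps(bridge_card, indent=2)
```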

PHASE 2 -- IMPLEMENTATION (MTA leads):
The MCP Tool Architect designs the tool layer.

1. Map A2A skills to MCP tools:
   - Tool: bridge_subscribe (wraps event_subscribe skill)
   - Tool: bridge_publish (wraps event_publish skill)
   - Tool: bridge_replay (wraps event_replay skill)
   - Tool: bridge_status (health check, connection count, lag)

2. Define MCP resources:
   - fcc://bridge/subscriptions (active subscription list)
   - fcc://bridge/events/{sequence} (individual event lookup)
   - fcc://bridge/metrics (throughput, latency, error rate)

3. Design error handling:
   - 400: Invalid subscription filter
   - 404: Event sequence not found in replay window
   - 429: Subscription limit exceeded
   - 503: Bridge temporarily unavailable

4. Handoff to PCA: Provide complete tool schemas, resource
   definitions, and example request/response pairs.
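One of the Phase 2 tool schemas might be expressed as in the following sketch. The property names mirror the event_subscribe skill from Phase 1; the MCP envelope keys (`name`, `description`, `inputSchema`) are assumptions about the tool-schema shape, not quoted from the specification.

```python
# Hedged sketch of the bridge_subscribe tool schema as JSON Schema.
# Envelope keys are assumptions; properties mirror the event_subscribe skill.
bridge_subscribe_schema = {
    "name": "bridge_subscribe",
    "description": "Subscribe to EventBus event types via the A2A bridge.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "event_types": {"type": "array", "items": {"type": "string"}},
            "filter": {"type": "object"},  # optional EventFilter
        },
        "required": ["event_types"],
    },
}

# Minimal required-field check against an example request payload.
example_request = {"event_types": ["WORKFLOW_ACTION_STARTED"]}
missing = [k for k in bridge_subscribe_schema["inputSchema"]["required"]
           if k not in example_request]
```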

PHASE 3 -- VALIDATION (PCA leads):
The Protocol Compliance Auditor validates conformance.

1. A2A compliance checks:
   - [ ] Agent card conforms to A2A schema
   - [ ] All skills have versioned input/output schemas
   - [ ] Backward compatibility maintained for v1.x clients
   - [ ] Capability declarations are machine-parseable

2. MCP compliance checks:
   - [ ] Tool schemas conform to MCP specification
   - [ ] Resource URIs follow URI template standard
   - [ ] Error responses use standard error codes
   - [ ] Cache policies defined for all resources

3. Integration checks:
   - [ ] EventBus event types map correctly to bridge events
   - [ ] Event serialization preserves all fields
   - [ ] Sequence numbers are monotonically increasing
   - [ ] Replay window covers at least 10,000 events

4. Produce audit report:
   | Check | Status | Severity | Evidence | Remediation |
   |-------|--------|----------|----------|-------------|
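Two of the integration checks above (sequence monotonicity, replay-window depth) are mechanical enough to automate. A minimal sketch, assuming events are dicts carrying a `sequence` field:

```python
# Hedged sketch of two Phase 3 integration checks. The event shape
# ({"sequence": int, ...}) is an assumption consistent with the spec above.
def check_monotonic(events):
    """True if sequence numbers are strictly increasing."""
    seqs = [e["sequence"] for e in events]
    return all(a < b for a, b in zip(seqs, seqs[1:]))

def check_replay_window(oldest_seq, newest_seq, minimum=10_000):
    """True if the replay window covers at least `minimum` events."""
    return (newest_seq - oldest_seq + 1) >= minimum
```

Each check maps to one row of the audit-report table, with the failing events as evidence.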

SPRINT DELIVERABLES:
1. Bridge agent card specification (ASD)
2. MCP tool schemas and resource definitions (MTA)
3. Compliance audit report with pass/fail per check (PCA)
4. Integration test plan covering all three phases
5. Sprint retrospective with improvement recommendations

2. JV Pipeline (IEA + PCO + OSC)

Prompt: Evaluate a New Joint Venture Partnership End-to-End

You are coordinating a three-persona JV evaluation pipeline to
assess, structure, and validate a proposed joint venture partnership
for integrating the Distiller NanoCube engine with the FCC framework.

TEAM COMPOSITION:
- IP Evaluation Analyst (IEA): Dual-axis IP assessment
- Partnership Coordinator (PCO): Partnership structuring and contracts
- Open Science Compliance Officer (OSC): Compliance validation

PROPOSED PARTNERSHIP:
- Partner: Distiller/Fornax Project
- IP under evaluation: NanoCube hierarchical query engine
- Integration scope: FCC persona dimension analysis, discernment
  matrix aggregation, and real-time event visualization
- Term: 2-year initial term with annual renewal option

PHASE 1 -- IP EVALUATION (IEA leads):

1. TECHNICAL FIT ASSESSMENT (score 0-10 each):
   - Novelty: How original is the NanoCube engine?
   - Feasibility: Can it integrate with FCC's Python stack?
   - Scalability: Will it handle 102 personas x 56 dimensions?
   - Maintainability: Can the FCC team contribute patches?
   - Interoperability: Does it support A2A/MCP protocols?

2. STRATEGIC VALUE ASSESSMENT (score 0-10 each):
   - Market Alignment: Does NanoCube address real user needs?
   - Competitive Advantage: Does it differentiate the ecosystem?
   - Ecosystem Fit: Does it complement FCC + PAOM + Doc Platform?
   - Investment Efficiency: Is integration cost justified?
   - Risk Mitigation: Does it reduce single-vendor risk?

3. QUADRANT CLASSIFICATION:
   - Star (both high): Proceed immediately
   - Strategic (low tech, high strategic): Invest in tech ramp
   - Technical (high tech, low strategic): Seek better alignment
   - Review (both low): Decline or defer

4. Handoff to PCO: IP evaluation report with scoring matrices,
   quadrant classification, and recommendation.
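The quadrant classification in step 3 reduces to a two-axis threshold rule. A minimal sketch, assuming a cutoff of 7.0 on the 0-10 scale separates "high" from "low" (the threshold is an assumption; the four labels come from the prompt above):

```python
# Hedged sketch of the Phase 1 quadrant classification. The 7.0 threshold
# is an assumption; quadrant labels come from the classification above.
def classify(technical_fit, strategic_value, threshold=7.0):
    high_tech = technical_fit >= threshold
    high_strat = strategic_value >= threshold
    if high_tech and high_strat:
        return "Star"        # proceed immediately
    if high_strat:
        return "Strategic"   # invest in tech ramp
    if high_tech:
        return "Technical"   # seek better alignment
    return "Review"          # decline or defer
```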

PHASE 2 -- PARTNERSHIP STRUCTURING (PCO leads):

1. Design exchange contract:
   - contract_id: "FCC-DISTILLER-001"
   - Data flows: persona dimensions (out), NanoCube results (in)
   - Protocol bindings: nanocube_query, fabrication_request MCP tools
   - SLA: < 2s query response, 99.5% availability

2. Define governance framework:
   - Joint steering committee (quarterly review)
   - Schema change notification protocol (2-week advance)
   - Breaking change policy (major version bump required)
   - Dispute resolution escalation path

3. Structure IP terms:
   - MIT licensing for all shared code
   - CC-BY 4.0 for shared documentation
   - Independent IP retention for core engines
   - Joint IP for integration layer

4. Handoff to OSC: Complete partnership package with contract,
   governance framework, and IP terms.

PHASE 3 -- COMPLIANCE VALIDATION (OSC leads):

1. Open science gate assessment:
   - FAIR-FIND: Are shared artifacts identifiable?
   - FAIR-ACCESS: Are access protocols standardized?
   - FAIR-INTEROP: Are data formats interoperable?
   - FAIR-REUSE: Are licenses clear and permissive?
   - REPRO-CODE: Is shared code reproducible?
   - REPRO-DATA: Is data processing reproducible?

2. Regulatory compliance:
   - Data sovereignty: Where is data stored and processed?
   - Privacy: Does the integration handle any PII?
   - Export controls: Are there restrictions on shared algorithms?

3. Risk assessment:
   - Vendor lock-in risk with NanoCube dependency
   - Data leakage risk across project boundaries
   - License compatibility risk between MIT and partner licenses

4. Produce compliance certification:
   | Gate | Status | Conditions | Expiry |
   |------|--------|------------|--------|

PIPELINE DELIVERABLES:
1. IP evaluation scorecard with quadrant classification (IEA)
2. Exchange contract and governance framework (PCO)
3. Compliance certification with gate results (OSC)
4. Joint recommendation: Proceed / Conditional / Decline
5. 90-day integration roadmap with milestones

3. Full-Stack Visualization (DVA + IDD + RER + UAA)

Prompt: Create an Accessible, Real-Time Dashboard

You are coordinating a four-persona visualization team to design and
build an accessible, real-time dashboard for monitoring FCC simulation
execution across all 102 personas.

TEAM COMPOSITION:
- D3 Visualization Architect (DVA): Chart components and data visualization
- Interactive Dashboard Designer (IDD): Layout, navigation, and state management
- Real-time Event Renderer (RER): Live event streaming and rendering pipeline
- UX Accessibility Auditor (UAA): WCAG compliance and inclusive design

DASHBOARD REQUIREMENTS:
- Monitor active simulations across the 20 persona categories
- Display real-time event streams from the EventBus (81 event types)
- Visualize persona activity, workflow progress, and quality scores
- Support 100 concurrent viewers via WebSocket connections

PHASE 1 -- COMPONENT DESIGN (DVA leads):

1. Persona Activity Grid:
   - 102 tiles arranged by category (20 groups)
   - Tile color indicates activity state (idle/active/error)
   - Tile size proportional to event count in current session
   - Click to drill down into persona detail view

2. Workflow Progress Chart:
   - Horizontal Gantt-style timeline
   - Swimlanes per workflow phase (Find, Create, Critique)
   - Progress bars showing step completion percentage
   - Milestone markers for phase transitions

3. Quality Score Radar:
   - Radar chart with 6 axes (one per discernment trait)
   - Overlaid polygons for current vs target scores
   - Animated transitions on score updates
   - Category selector for filtering

4. Event Volume Sparklines:
   - Small multiples: one sparkline per event type
   - 60-second rolling window, 1-second resolution
   - Peak indicators and trend arrows
   - Click to expand into full histogram

Handoff to IDD: Component specifications with data interfaces,
prop types, and interaction contracts.
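The sparkline component's data interface can be sketched as a bucketing function: events in a 60-second rolling window, counted at 1-second resolution. Timestamps as epoch seconds are an assumption about the event feed.

```python
# Hedged sketch of the sparkline data shaping from component 4 above.
# Epoch-second timestamps are an assumption about the event interface.
def sparkline_buckets(timestamps, now, window=60):
    """Count events per 1-second bucket over the last `window` seconds."""
    counts = [0] * window
    for t in timestamps:
        age = int(now - t)
        if 0 <= age < window:
            counts[window - 1 - age] += 1  # newest bucket last
    return counts
```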

PHASE 2 -- DASHBOARD LAYOUT (IDD leads):

1. Layout grid:
   - 12-column responsive grid (CSS Grid)
   - Header: simulation name, status badge, elapsed time
   - Main area: Persona Grid (8 cols) + Quality Radar (4 cols)
   - Bottom: Workflow Progress (full width)
   - Sidebar: Event Sparklines (collapsible, 3 cols)

2. Navigation and state:
   - Tab bar: Overview | Personas | Workflow | Events
   - URL-based routing with deep-link support
   - Dashboard state persisted to localStorage
   - Filter toolbar: category, phase, event type, time range

3. Responsive breakpoints:
   - Desktop (>= 1200px): Full grid layout
   - Tablet (768-1199px): Stack radar below grid
   - Mobile (< 768px): Single column, cards
   - Print: Snapshot mode with static charts

4. Theme system:
   - Light and dark mode with system preference detection
   - High-contrast mode for accessibility
   - Category colors from persona YAML definitions

Handoff to RER: Layout specification with data binding points
and update frequency requirements per component.

PHASE 3 -- REAL-TIME RENDERING (RER leads):

1. Event ingestion:
   - WebSocket connection to FCC event bridge
   - Subscribe to all 81 event types with per-component filters
   - Buffer up to 500 events during heavy load
   - Reconnect with exponential backoff on disconnect

2. Render pipeline:
   - Event queue -> classifier -> component dispatcher
   - Batch updates: aggregate events per 100ms render frame
   - Priority: error events rendered immediately, info batched
   - Animation budget: 16ms per frame (60fps target)

3. State management:
   - Immutable state snapshots for each render frame
   - Diff-based updates to minimize DOM mutations
   - History buffer: last 1000 state snapshots for playback
   - Pause/resume control for debugging

4. Performance safeguards:
   - Drop low-priority events if queue exceeds 200
   - Reduce sparkline resolution under load (5s buckets)
   - Disable animations if frame rate drops below 30fps
   - Memory ceiling: 100MB for event buffer

Handoff to UAA: Running dashboard prototype with documented
interaction patterns and rendering behavior.
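The load-shedding safeguard from step 4 (drop low-priority events when the queue exceeds 200) can be sketched as a guarded enqueue. This simplifies the full pipeline: here error events are merely never dropped, whereas the spec also renders them immediately; priority labels are assumptions consistent with the description above.

```python
# Hedged sketch of the queue-overflow safeguard. Priority labels and event
# shape are assumptions; immediate rendering of errors is not modeled here.
from collections import deque

MAX_QUEUE = 200

def enqueue(queue: deque, event: dict) -> None:
    if len(queue) >= MAX_QUEUE and event.get("priority") != "error":
        return  # shed low-priority load; error events are never dropped
    queue.append(event)
```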

PHASE 4 -- ACCESSIBILITY AUDIT (UAA leads):

1. WCAG 2.1 AA compliance:
   - [ ] All text meets 4.5:1 contrast ratio (AA)
   - [ ] Interactive elements have visible focus indicators
   - [ ] Color is not the sole means of conveying information
   - [ ] All charts have text alternatives (data tables)
   - [ ] Animations respect prefers-reduced-motion

2. Keyboard navigation:
   - [ ] All components reachable via Tab key
   - [ ] Persona grid navigable with arrow keys
   - [ ] Escape closes drill-down panels
   - [ ] Keyboard shortcuts documented and discoverable
   - [ ] Focus trap in modal dialogs

3. Screen reader support:
   - [ ] ARIA landmarks for all dashboard regions
   - [ ] Live regions for real-time event announcements
   - [ ] Chart descriptions announced on focus
   - [ ] Status changes announced (simulation start/stop)
   - [ ] Data table alternatives for all visualizations

4. Inclusive design:
   - [ ] High-contrast mode meets AAA (7:1) ratios
   - [ ] Pattern fills available for chart elements
   - [ ] Font size adjustable (100% to 200%)
   - [ ] Touch targets >= 44px on mobile
   - [ ] No content requires horizontal scrolling at 320px

5. Produce audit report:
   | Component | Check | Status | WCAG Criterion | Remediation |
   |-----------|-------|--------|----------------|-------------|

TEAM DELIVERABLES:
1. D3 component library with data interfaces (DVA)
2. Dashboard layout specification with responsive breakpoints (IDD)
3. Real-time rendering pipeline with performance budgets (RER)
4. Accessibility audit report with remediation plan (UAA)
5. Integration test plan covering all four personas' contributions
6. Performance benchmark results (events/sec at target frame rate)

4. Six-Persona Governance Review (DGS + GCA + PCA + OSC + JDA2 + PTE)

Prompt: Full Governance Audit of the FCC Framework

You are coordinating a six-persona governance review team to conduct
a comprehensive audit of the FCC framework's governance posture,
covering data governance, compliance, protocol conformance, open
science, dependency management, and privacy.

TEAM COMPOSITION:
- Data Governance Specialist (DGS): Data flow and API contract audit
- Governance Compliance Auditor (GCA): Quality gate and constitution audit
- Protocol Compliance Auditor (PCA): A2A/MCP protocol conformance
- Open Science Compliance Officer (OSC): FAIR and open science gates
- JV Dependency Auditor (JDA2): Dependency and provenance audit
- Privacy Taxonomy Engineer (PTE): Privacy and data classification

AUDIT SCOPE:
- Framework version: v0.7.0
- Persona count: 102 across 20 categories
- Data assets: YAML definitions, JSON schemas, simulation traces
- Protocols: A2A, MCP, WebSocket, EventBus
- Integrations: Distiller, PAOM, Doc Platform

WORKSTREAM 1 -- DATA GOVERNANCE (DGS leads):

1. API contract inventory:
   - Catalog all internal APIs (PersonaRegistry, EventBus,
     SimulationEngine, ActionEngine, CrossReferenceMatrix)
   - Verify each API has a documented contract
   - Check contract versioning and change management

2. Data flow mapping:
   - Map data flows from YAML sources to simulation outputs
   - Identify data transformation points and validation gaps
   - Verify data retention policies are enforced

3. Service configuration audit:
   - Review all configuration files for completeness
   - Check for hardcoded values that should be configurable
   - Verify configuration validation at startup

WORKSTREAM 2 -- COMPLIANCE (GCA leads):

1. Quality gate assessment:
   - Evaluate all 25 quality gates in governance/quality_gates.yaml
   - Test gate enforcement in CI/CD pipeline
   - Verify gate severity levels are appropriate

2. Constitution audit:
   - Review 3-tier constitution (hard-stop/mandatory/preferred)
   - Verify each persona has constitution entries
   - Check for constitution conflicts across categories

3. Tag registry validation:
   - Validate all 30 tags in governance/tag_registry.yaml
   - Check tag coverage across personas and actions
   - Verify tag hierarchy consistency

WORKSTREAM 3 -- PROTOCOL CONFORMANCE (PCA leads):

1. A2A compliance:
   - Validate agent cards against a2a_card.schema.json
   - Test skill definitions for completeness
   - Verify interoperability test coverage

2. MCP compliance:
   - Validate tool schemas against mcp_tool.schema.json
   - Test resource URI resolution
   - Verify prompt template conformance

3. Event protocol compliance:
   - Validate event serialization round-trip fidelity
   - Test sequence number monotonicity
   - Verify cross-protocol event type consistency
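The round-trip fidelity check in item 3 can be automated as a serialize/deserialize comparison. A minimal sketch, assuming events are JSON-serializable dicts:

```python
# Hedged sketch of the Workstream 3 round-trip fidelity check.
# Assumes events are plain JSON-serializable dicts.
import json

def round_trip_ok(event: dict) -> bool:
    """True if serializing to JSON and back preserves all fields."""
    restored = json.loads(json.dumps(event))
    return restored == event
```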

WORKSTREAM 4 -- OPEN SCIENCE (OSC leads):

1. FAIR gate assessment:
   - Evaluate all 4 mandatory FAIR gates
   - Evaluate 2 preferred reproducibility gates
   - Score per-criterion with evidence

2. Open access review:
   - Verify all framework code is MIT-licensed
   - Check documentation licensing (CC-BY 4.0)
   - Assess data accessibility for research use

3. Citation and attribution:
   - Verify CITATION.cff or equivalent exists
   - Check third-party attribution completeness
   - Review academic citation format

WORKSTREAM 5 -- DEPENDENCY AUDIT (JDA2 leads):

1. Dependency inventory:
   - Catalog all Python dependencies (core + optional + dev)
   - Check for known vulnerabilities (CVE database)
   - Verify license compatibility for all dependencies

2. Provenance chain:
   - Trace data provenance through all 7 pipeline stages
   - Identify provenance gaps or breaks
   - Verify cryptographic integrity where applicable

3. Cross-project dependencies:
   - Map dependencies on Distiller, PAOM, Doc Platform
   - Assess vendor lock-in risk for each dependency
   - Document fallback strategies for critical dependencies

WORKSTREAM 6 -- PRIVACY (PTE leads):

1. Data classification:
   - Classify all data assets (public/internal/confidential/restricted)
   - Verify classification labels are applied consistently
   - Check for PII in test data and sample scenarios

2. Privacy impact assessment:
   - Assess persona definitions for embedded PII risk
   - Review simulation traces for data leakage potential
   - Evaluate event bus logs for sensitive content

3. Privacy taxonomy:
   - Map data elements to privacy categories
   - Verify consent requirements for data collection
   - Check data minimization practices

CONSOLIDATION:
After all six workstreams complete, produce:

1. Executive summary:
   - Overall governance maturity score (1-5)
   - Critical findings requiring immediate action
   - Strategic recommendations for governance improvement

2. Detailed findings:
   | Workstream | Finding ID | Severity | Description | Remediation |
   |------------|-----------|----------|-------------|-------------|

3. Risk register:
   | Risk | Probability | Impact | Owner | Mitigation | Deadline |
   |------|-------------|--------|-------|------------|----------|

4. Governance roadmap:
   - Q1: Critical remediations
   - Q2: Process improvements
   - Q3: Automation and tooling
   - Q4: Maturity reassessment

5. Sign-off matrix:
   | Workstream | Lead Persona | Status | Date | Notes |
   |------------|-------------|--------|------|-------|
   | Data Governance | DGS | ... | ... | ... |
   | Compliance | GCA | ... | ... | ... |
   | Protocol | PCA | ... | ... | ... |
   | Open Science | OSC | ... | ... | ... |
   | Dependencies | JDA2 | ... | ... | ... |
   | Privacy | PTE | ... | ... | ... |

5. Champion Orchestration (RCHM orchestrating RC + CIA + STE + RIC)

Prompt: Champion-Led Research Sprint

You are the Research Crafter Champion (RCHM). Orchestrate a research
sprint across your four subordinate personas to produce a unified
research package for a new FCC feature: persona dimension evolution
tracking.

CHAMPION CONTEXT:
- Champion: RCHM (Research Crafter Champion)
- Orchestrates: RC (Research Crafter), CIA (Catalog Indexer Architect),
  STE (Semantic Taxonomy Engineer), RIC (Research Inventory Crafter)
- FCC Phase: Orchestration (coordinates Find-phase personas)
- Handoff target: BCHM (Blueprint Crafter Champion) for Create phase

SPRINT OBJECTIVE:
Research and document the requirements for a persona dimension
evolution tracking system that records how persona dimension profiles
change over time as the framework evolves from v0.7.0 toward v1.0.0.

ORCHESTRATION PLAN:

STEP 1 -- SCOPE ASSIGNMENT (RCHM):
Assign research tasks to each subordinate persona.

To RC (Research Crafter):
- Research existing dimension evolution patterns in the codebase
- Survey academic literature on persona/archetype evolution models
- Identify comparable tracking systems in AI agent frameworks
- Produce a capability matrix of dimension evolution approaches
- Deliverable: Research inventory with annotated references

To CIA (Catalog Indexer Architect):
- Design the catalog schema for storing dimension snapshots
- Define indexing strategy for time-series dimension data
- Propose query patterns for comparing dimension profiles
  across versions
- Deliverable: Catalog schema with indexing specification

To STE (Semantic Taxonomy Engineer):
- Define the taxonomy for evolution event types (add, modify,
  remove, merge, split)
- Design controlled vocabulary for dimension change reasons
- Map evolution events to existing FCC taxonomy structure
- Deliverable: Evolution taxonomy with SKOS-compatible terms

To RIC (Research Inventory Crafter):
- Automate the collection of current dimension profiles
- Build a snapshot comparison tool (diff between versions)
- Generate change reports for the v0.6.0 to v0.7.0 transition
- Deliverable: Automated collection scripts and change report
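RIC's snapshot comparison tool can be sketched as a pairwise diff that labels each changed dimension with one of STE's evolution event types. Only add/modify/remove are derivable from a plain diff; merge and split would need extra metadata. The profile shape `{dimension_name: value}` is an assumption.

```python
# Hedged sketch of RIC's snapshot diff. Profile shape is an assumption;
# change labels reuse three of STE's evolution event types.
def diff_profiles(old: dict, new: dict) -> dict:
    """Classify per-dimension changes between two profile snapshots."""
    changes = {}
    for dim in old.keys() | new.keys():
        if dim not in old:
            changes[dim] = "add"
        elif dim not in new:
            changes[dim] = "remove"
        elif old[dim] != new[dim]:
            changes[dim] = "modify"
    return changes
```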

STEP 2 -- PARALLEL EXECUTION (RC, CIA, STE, RIC):
Each persona works independently on their assigned tasks.
RCHM monitors progress through EventBus WORKFLOW_ACTION_STARTED
and WORKFLOW_ACTION_COMPLETED events.

Progress checkpoints:
- Day 1 end: Initial findings and draft schemas
- Day 2 end: Refined deliverables with cross-references
- Day 3 end: Final deliverables ready for synthesis

STEP 3 -- CROSS-ARTIFACT CONSISTENCY (RCHM):
Validate consistency across all four deliverables.

Consistency checks:
- [ ] RC's capability matrix references CIA's catalog schema fields
- [ ] CIA's indexing strategy supports STE's taxonomy queries
- [ ] STE's evolution event types align with RIC's change reports
- [ ] All deliverables use consistent terminology and naming
- [ ] No gaps between research scope and catalog coverage
- [ ] Taxonomy terms cover all observed evolution patterns

STEP 4 -- SYNTHESIS (RCHM):
Combine deliverables into a unified research package.

Unified package contents:
1. Executive summary: Key findings and recommendations
2. Research inventory: Annotated references (from RC)
3. Catalog specification: Schema and indexing (from CIA)
4. Evolution taxonomy: Terms and relationships (from STE)
5. Automated tools: Collection and comparison (from RIC)
6. Cross-artifact validation report (from RCHM)
7. Research readiness assessment for Create phase

STEP 5 -- HANDOFF TO BCHM:
Package the unified research for the Blueprint Crafter Champion.

Handoff checklist:
- [ ] All four subordinate deliverables complete and validated
- [ ] Cross-artifact consistency report clean (no open issues)
- [ ] Research readiness score >= 80%
- [ ] BCHM briefing document prepared
- [ ] Open questions documented with recommended resolution paths
- [ ] EventBus PHASE_TRANSITION event published

SPRINT DELIVERABLES:
1. Unified research package (RCHM synthesis)
2. Per-persona deliverables (RC, CIA, STE, RIC)
3. Orchestration log documenting all coordination decisions
4. Cross-artifact consistency validation report
5. Handoff package for BCHM with readiness assessment

6. Cross-Category Protocol-to-Viz Pipeline (ASD + DVA + UAA)

Prompt: Protocol Design to Visualization to Accessibility Audit

You are coordinating a cross-category pipeline spanning the
protocol_engineering and ux_visualization categories to design
a new protocol, visualize it, and audit it for accessibility.

TEAM COMPOSITION:
- A2A Skill Designer (ASD, protocol_engineering): Protocol design
- D3 Visualization Architect (DVA, ux_visualization): Visualization
- UX Accessibility Auditor (UAA, ux_visualization): Accessibility audit

PIPELINE OBJECTIVE:
Design an A2A protocol for persona collaboration events, create an
interactive visualization of the protocol flow, and ensure the
visualization meets WCAG 2.1 AA accessibility standards.

STAGE 1 -- PROTOCOL DESIGN (ASD):

1. Define the collaboration event protocol:
   Event types:
   - COLLABORATION_REQUESTED: Persona A requests collaboration with B
   - COLLABORATION_ACCEPTED: Persona B accepts the request
   - COLLABORATION_DECLINED: Persona B declines with reason
   - COLLABORATION_COMPLETED: Joint work product delivered
   - COLLABORATION_TIMEOUT: Request expired without response

2. Design message schemas:
   ```json
   {
     "event_type": "COLLABORATION_REQUESTED",
     "source_persona": "RC",
     "target_persona": "BC",
     "collaboration_type": "review",
     "priority": "normal",
     "payload": {
       "artifact_id": "research-inventory-001",
       "requested_action": "Create blueprint from research",
       "deadline": "2026-04-15T17:00:00Z"
     },
     "metadata": {
       "sequence": 12345,
       "timestamp": "2026-03-29T10:00:00Z",
       "session_id": "sim-session-42"
     }
   }
   ```

3. Define state machine:
   IDLE -> REQUESTED -> ACCEPTED -> IN_PROGRESS -> COMPLETED
           REQUESTED -> DECLINED
           REQUESTED -> TIMEOUT

4. Handoff to DVA: Protocol specification with event types,
   message schemas, and state machine definition.
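The state machine from step 3 can be handed off as a transition table with a validator, so DVA's state diagram and the protocol agree on legal edges. Transitions mirror the diagram above; representing them as a dict of sets is an implementation choice, not part of the spec.

```python
# Hedged sketch of the Stage 1 state machine as a transition table.
# Edges mirror the diagram above; the representation is an assumption.
TRANSITIONS = {
    "IDLE": {"REQUESTED"},
    "REQUESTED": {"ACCEPTED", "DECLINED", "TIMEOUT"},
    "ACCEPTED": {"IN_PROGRESS"},
    "IN_PROGRESS": {"COMPLETED"},
}

def can_transition(current: str, nxt: str) -> bool:
    """True if `nxt` is a legal successor of `current`."""
    return nxt in TRANSITIONS.get(current, set())
```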

STAGE 2 -- VISUALIZATION DESIGN (DVA):

1. Protocol Flow Diagram:
   - Chart type: Animated sequence diagram
   - Layout: Persona nodes on left/right, messages as arrows
   - Animation: Messages travel along arrows with timing
   - State indicators: Color-coded badges on persona nodes

2. State Machine Visualization:
   - Chart type: Interactive state diagram
   - Nodes: States (IDLE, REQUESTED, ACCEPTED, etc.)
   - Edges: Transitions with event type labels
   - Current state: Highlighted node with pulse animation
   - History: Trail showing state transition sequence

3. Collaboration Network:
   - Chart type: Force-directed graph
   - Nodes: Personas that have participated in collaborations
   - Edges: Collaboration links, thickness = frequency
   - Filters: By event type, time range, persona category
   - Animation: New collaborations appear with spring animation

4. Dashboard Integration:
   - Embed all three charts in a tabbed panel
   - Shared time filter across all charts
   - Real-time updates via EventBus subscription
   - Export: SVG snapshot, CSV data table

5. Handoff to UAA: Visualization prototype with interactive
   elements, animations, and data tables.

STAGE 3 -- ACCESSIBILITY AUDIT (UAA):

1. Visual accessibility:
   - [ ] Color contrast >= 4.5:1 for all text (AA)
   - [ ] State colors distinguishable without color alone
   - [ ] Animation respects prefers-reduced-motion
   - [ ] Pattern fills available for colorblind users
   - [ ] Minimum font size 14px for chart labels

2. Interactive accessibility:
   - [ ] All interactive elements keyboard-accessible
   - [ ] Tab order follows logical reading sequence
   - [ ] Focus indicators visible on all focusable elements
   - [ ] Drag interactions have keyboard alternatives
   - [ ] Touch targets >= 44px on mobile

3. Content accessibility:
   - [ ] ARIA labels on all chart elements
   - [ ] Live regions announce state transitions
   - [ ] Data table alternatives for all charts
   - [ ] Screen reader can navigate collaboration flow
   - [ ] Alt text for exported SVG snapshots

4. Cognitive accessibility:
   - [ ] Animations have pause controls
   - [ ] Complex interactions have help tooltips
   - [ ] Error states have clear explanations
   - [ ] No time-limited interactions without extension option
   - [ ] Consistent navigation patterns across all tabs

5. Produce remediation report:
   | Chart | Issue | WCAG Criterion | Severity | Fix |
   |-------|-------|----------------|----------|-----|

PIPELINE DELIVERABLES:
1. Collaboration event protocol specification (ASD)
2. Three interactive D3.js visualization components (DVA)
3. Accessibility audit report with per-chart findings (UAA)
4. Cross-stage validation confirming protocol-to-viz fidelity
5. Remediation plan with estimated effort per fix
6. Final sign-off checklist for production deployment

Quick Reference: Multi-Persona Orchestration Template

You are coordinating a [N]-persona team to [objective].

TEAM COMPOSITION:
- [Persona Name] ([ID], [category]): [role in this workflow]
- [Persona Name] ([ID], [category]): [role in this workflow]
...

ORCHESTRATION PLAN:

PHASE 1 -- [PHASE NAME] ([LEAD_ID] leads):
1. [Task with deliverable specification]
2. [Task with deliverable specification]
Handoff to [NEXT_ID]: [handoff artifact description]

PHASE 2 -- [PHASE NAME] ([LEAD_ID] leads):
1. [Task with deliverable specification]
2. [Task with deliverable specification]
Handoff to [NEXT_ID]: [handoff artifact description]

...

DELIVERABLES:
1. [Per-phase deliverable with responsible persona]
2. [Cross-phase integration deliverable]
3. [Validation or audit report]