Collaboration Scoring Demo¶
This demo provides a guided walkthrough of the FCC human-in-the-loop collaboration engine -- from session creation through turn-taking, deliverable scoring, approval gates, and progress tracking.
Table of Contents¶
- Overview
- Prerequisites
- Running the Demo
- Step-by-Step Walkthrough
- Expected Output
- What You Will Learn
- Next Steps
Overview¶
The FCC collaboration engine supports multi-persona sessions where human reviewers and AI personas work together through structured turns. Each turn produces deliverables that are scored against quality dimensions, and approval gates ensure that outputs meet defined thresholds before advancing to the next workflow phase.
Prerequisites¶
- Python 3.10+ with FCC installed (`pip install -e ".[dev]"`)
- No API key required -- all scoring uses the deterministic engine
Running the Demo¶
CLI¶
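The demo is registered under the ID `collaboration_scoring`. The exact subcommand below is an assumption, not confirmed FCC syntax; check `fcc --help` for the command set in your installation.

```shell
# Hypothetical invocation (subcommand name assumed): run the demo by its registry ID
fcc demo run collaboration_scoring
```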
Programmatic¶
from fcc.demos.registry import DemoRegistry
from fcc.demos.runner import DemoRunner
registry = DemoRegistry.from_builtin()
demo = registry.get("collaboration_scoring")
runner = DemoRunner()
result = runner.run(demo)
print(f"Success: {result.success}, Steps: {result.steps_completed}/{result.total_steps}")
Step-by-Step Walkthrough¶
Step 1: Create Session¶
Initialize a collaboration session with assigned personas and a workflow context.
from fcc.collaboration.engine import CollaborationEngine
engine = CollaborationEngine()
session = engine.create_session(
personas=["architect", "critic"],
workflow="extended",
)
Step 2: Take Turns¶
Execute multiple collaboration turns where personas contribute sequentially, each recording input and output.
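The real turn-taking interface is not reproduced here; as a rough sketch of the pattern (the `Turn` and `Session` classes below are illustrative stand-ins, not the actual `CollaborationEngine` API), each turn records which persona acted, the input it received, and the output it produced:

```python
from dataclasses import dataclass, field

@dataclass
class Turn:
    # Illustrative stand-in for an FCC turn record, not the real API.
    persona: str   # who contributed this turn, e.g. "architect"
    prompt: str    # input the persona received
    output: str    # deliverable text the persona produced

@dataclass
class Session:
    personas: list[str]
    turns: list[Turn] = field(default_factory=list)

    def take_turn(self, prompt: str, output: str) -> Turn:
        # Personas contribute sequentially, round-robin over the roster.
        persona = self.personas[len(self.turns) % len(self.personas)]
        turn = Turn(persona=persona, prompt=prompt, output=output)
        self.turns.append(turn)
        return turn

session = Session(personas=["architect", "critic"])
session.take_turn("Propose a design", "Use a layered architecture")
session.take_turn("Critique the design", "Layering adds latency; justify it")
print([t.persona for t in session.turns])  # ['architect', 'critic']
```

The round-robin rotation keeps persona attribution deterministic, which is what lets the scoring and progress steps later in the demo reason about who produced each deliverable.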
Step 3: Score Deliverables¶
Use the ScoringEngine to evaluate session deliverables across four dimensions: completeness, accuracy, clarity, and relevance.
from fcc.collaboration.scoring import ScoringEngine
scorer = ScoringEngine()
# `deliverable` is a deliverable produced by an earlier session turn
scores = scorer.score(deliverable)
print(f"Overall: {scores.overall:.2f}")
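Internally, a deterministic scorer reduces the per-dimension scores to a single overall value. A minimal sketch of that aggregation follows; equal weights are an assumption, as the real ScoringEngine weighting is not documented here:

```python
DIMENSIONS = ("completeness", "accuracy", "clarity", "relevance")

def overall_score(scores: dict[str, float]) -> float:
    # Equal-weight mean over the four quality dimensions (weighting assumed).
    missing = [d for d in DIMENSIONS if d not in scores]
    if missing:
        raise ValueError(f"missing dimensions: {missing}")
    return sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS)

score = overall_score({"completeness": 0.9, "accuracy": 0.8,
                       "clarity": 0.7, "relevance": 1.0})
print(f"Overall: {score:.2f}")  # Overall: 0.85
```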
Step 4: Evaluate Approval Gate¶
Check whether deliverables meet the approval gate threshold. Failing dimensions receive remediation recommendations.
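The gate logic can be sketched as a per-dimension threshold check. The 0.7 threshold and the recommendation wording below are illustrative choices, not FCC defaults:

```python
def evaluate_gate(scores: dict[str, float], threshold: float = 0.7):
    # Pass only if every dimension clears the threshold; otherwise
    # return a remediation hint per failing dimension.
    recommendations = {
        dim: f"revise deliverable to raise {dim} above {threshold}"
        for dim, value in scores.items() if value < threshold
    }
    return (len(recommendations) == 0, recommendations)

passed, fixes = evaluate_gate({"completeness": 0.9, "accuracy": 0.8,
                               "clarity": 0.6, "relevance": 1.0})
print(passed)        # False
print(list(fixes))   # ['clarity']
```

Requiring every dimension to pass (rather than gating on the overall mean) is what allows the engine to point at specific failing dimensions in its remediation output.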
Step 5: Show Progress Tracking¶
Display the ProgressTracker dashboard with completion percentage, turn count, and gate status.
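The completion percentage is a simple ratio of turns taken to turns planned. A sketch of the dashboard line follows; the field names and output format are illustrative, not the real ProgressTracker rendering:

```python
def progress_summary(turns_taken: int, turns_planned: int, gate_passed: bool) -> str:
    # Completion as a percentage of planned turns, clamped to 100%.
    pct = min(100.0, 100.0 * turns_taken / max(turns_planned, 1))
    gate = "passed" if gate_passed else "pending"
    return f"{pct:.0f}% complete | turns: {turns_taken}/{turns_planned} | gate: {gate}"

print(progress_summary(3, 4, gate_passed=False))
# 75% complete | turns: 3/4 | gate: pending
```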
Expected Output¶
Step 1/5: Create Session ......................... OK
Step 2/5: Take Turns ............................. OK
Step 3/5: Score Deliverables ..................... OK
Step 4/5: Evaluate Approval Gate ................. OK
Step 5/5: Show Progress Tracking ................. OK
Demo complete: 5/5 steps passed (6.2 ms)
What You Will Learn¶
- How to create and manage collaboration sessions
- Multi-turn conversation patterns with persona attribution
- Deliverable scoring across quality dimensions
- Approval gate evaluation and threshold configuration
- Progress tracking for session lifecycle management
- Integration with the EventBus for session event streaming
Next Steps¶
- Tutorial Tracking Demo -- Progress checkpoints
- Persona Dimensions Demo -- Persona profiles
- Messaging Patterns Demo -- Event bus patterns
- Guidebook Chapter 10