# FAIR Compliance Workflow

This guide shows how to use the FCC Agent Team Framework to ensure your research data and outputs comply with the FAIR principles: Findable, Accessible, Interoperable, and Reusable.
## FAIR Principles Overview
| Principle | Description | FCC Support |
|---|---|---|
| Findable | Data has rich metadata and unique identifiers | Knowledge graph, semantic search |
| Accessible | Data is retrievable via standard protocols | Protocol layer (A2A, MCP) |
| Interoperable | Data uses shared vocabularies and standards | KG serializers (RDF, OWL, SKOS) |
| Reusable | Data has clear licensing and provenance | Trace recording, event replay |
## Setting Up a FAIR Workflow

### Step 1: Configure Open Science Personas

The `open_science` category contains personas specifically designed for research workflows:
```python
from fcc.personas.registry import PersonaRegistry

registry = PersonaRegistry.from_package_data()
open_science = registry.by_category("open_science")

for persona in open_science:
    print(f"{persona.id}: {persona.name}")
    print(f"  Role: {persona.riscear.role}")
```
### Step 2: Build a Research Knowledge Graph

Capture your research concepts, datasets, methods, and findings in a knowledge graph:
```python
from fcc.knowledge.graph import KnowledgeGraph

kg = KnowledgeGraph()

# Add research entities
kg.add_node("dataset_001", node_type="DATASET", metadata={
    "title": "Experimental Results Q4",
    "doi": "10.1234/example",
    "license": "CC-BY-4.0",
    "format": "CSV",
})
kg.add_node("method_001", node_type="METHOD", metadata={
    "title": "Bayesian Analysis",
    "version": "2.1",
})

# Record that the method produced the dataset
kg.add_edge("method_001", "dataset_001", edge_type="PRODUCES")
```
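If you want to prototype this node-and-edge data model without the `fcc` dependency, a minimal sketch in plain Python (a hypothetical `MiniGraph`, not the fcc implementation) could look like:

```python
from dataclasses import dataclass, field

@dataclass
class MiniGraph:
    """Toy node/edge store mirroring the DATASET/METHOD model above."""
    nodes: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)

    def add_node(self, node_id, node_type, metadata=None):
        self.nodes[node_id] = {"type": node_type, "metadata": metadata or {}}

    def add_edge(self, source, target, edge_type):
        self.edges.append((source, target, edge_type))

    def neighbors(self, node_id, edge_type=None):
        """All targets reachable from node_id, optionally filtered by edge type."""
        return [t for s, t, e in self.edges
                if s == node_id and (edge_type is None or e == edge_type)]

g = MiniGraph()
g.add_node("dataset_001", "DATASET", {"license": "CC-BY-4.0"})
g.add_node("method_001", "METHOD", {"version": "2.1"})
g.add_edge("method_001", "dataset_001", "PRODUCES")
print(g.neighbors("method_001", "PRODUCES"))  # ['dataset_001']
```

Keeping edges typed (`PRODUCES`, `DERIVED_FROM`, etc.) is what later makes provenance queries possible.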
### Step 3: Export to Standard Formats

Export your knowledge graph to interoperable formats:
```python
from fcc.knowledge.serializers import RDFSerializer, SKOSSerializer

# Export as RDF (Turtle) for semantic web integration
rdf_serializer = RDFSerializer()
rdf_output = rdf_serializer.serialize(kg, format="turtle")

# Export as SKOS for taxonomy exchange
skos_serializer = SKOSSerializer()
skos_output = skos_serializer.serialize(kg)
```
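To make the Turtle target concrete, here is a hand-rolled sketch of what an RDF export of the `dataset_001` node might look like. The `ex:` namespace and the Dublin Core (`dcterms:`) predicate choices are illustrative assumptions, not necessarily what fcc's `RDFSerializer` emits:

```python
# Illustrative only: build a tiny Turtle document for one dataset node.
metadata = {
    "title": "Experimental Results Q4",
    "doi": "10.1234/example",
    "license": "CC-BY-4.0",
}

title = metadata["title"]
doi = metadata["doi"]
license_id = metadata["license"]

turtle = "\n".join([
    "@prefix dcterms: <http://purl.org/dc/terms/> .",
    "@prefix ex: <http://example.org/> .",
    "",
    "ex:dataset_001",
    f'    dcterms:title "{title}" ;',
    f'    dcterms:identifier "doi:{doi}" ;',
    f'    dcterms:license "{license_id}" .',
])
print(turtle)
```

Because Turtle is a plain-text, widely supported serialization of RDF, output like this can be loaded by any triple store or semantic-web toolchain without a dependency on FCC.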
### Step 4: Enable Provenance Tracking

Use the event bus and tracing to record full provenance:
```python
from fcc.messaging.bus import EventBus
from fcc.messaging.serialization import EventSerializer
from fcc.observability.tracing import FccTracer

bus = EventBus()
tracer = FccTracer()

# All operations inside the span are traced and event-logged
with tracer.start_span("data_processing") as span:
    span.set_attribute("dataset_id", "dataset_001")
    span.set_attribute("method", "bayesian_analysis")
    # ... processing logic ...
```
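The provenance idea behind this step is an append-only log of who did what to which artifact. A minimal sketch in plain Python (the `ProvenanceEvent` schema is hypothetical, not the fcc event format) looks like:

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class ProvenanceEvent:
    """One immutable provenance record (illustrative schema)."""
    timestamp: float
    actor: str
    action: str
    target: str

log: list = []

def record(actor, action, target):
    """Append an event to the provenance log and return it."""
    event = ProvenanceEvent(time.time(), actor, action, target)
    log.append(event)
    return event

record("method_001", "PRODUCES", "dataset_001")
record("persona:data_steward", "REVIEWED", "dataset_001")

# Serialize the log so provenance can be archived alongside the data
serialized = json.dumps([asdict(e) for e in log], indent=2)
print(serialized)
```

Because every record carries a timestamp, actor, and target, the log can later be replayed to reconstruct exactly how a dataset came to be, which is the Reusable half of FAIR.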
## FAIR Checklist

Use this checklist to verify FAIR compliance for your research outputs:
**Findable:**

- [ ] Each dataset has a unique identifier (DOI, URI, or internal ID)
- [ ] Rich metadata is stored in the knowledge graph
- [ ] Metadata is indexed in the semantic search system
- [ ] Datasets are discoverable via the persona search index

**Accessible:**

- [ ] Data access protocols are documented
- [ ] Authentication requirements are specified
- [ ] Access metadata persists even if data is removed
- [ ] Protocol bridge supports standard retrieval methods

**Interoperable:**

- [ ] Knowledge graph uses standard ontologies (OWL, SKOS)
- [ ] Data exports use standard formats (RDF, JSON-LD)
- [ ] Vocabulary mappings are documented
- [ ] Cross-namespace resolution is configured for federated datasets

**Reusable:**

- [ ] License information is attached to all datasets
- [ ] Provenance is captured via traces and events
- [ ] Processing steps are recorded and reproducible
- [ ] Quality metadata is available through governance gates
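Several checklist items reduce to "is this metadata field present on the node?", which can be checked mechanically. A hedged sketch (the required-field mapping below is an example policy, not an fcc feature) might look like:

```python
# Which metadata fields back which FAIR dimensions (example policy only)
REQUIRED_FIELDS = {
    "findable": ["doi", "title"],
    "reusable": ["license"],
}

def fair_gaps(metadata: dict) -> dict:
    """Return FAIR dimensions that have missing metadata fields."""
    return {
        dim: [f for f in fields if not metadata.get(f)]
        for dim, fields in REQUIRED_FIELDS.items()
        if any(not metadata.get(f) for f in fields)
    }

ok = {"title": "Experimental Results Q4", "doi": "10.1234/example",
      "license": "CC-BY-4.0"}
bad = {"title": "Untracked dump"}

print(fair_gaps(ok))   # {} -> fully covered
print(fair_gaps(bad))  # {'findable': ['doi'], 'reusable': ['license']}
```

Running such a check as part of a governance gate turns the manual checklist into an automated pre-publication test.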
## Related Resources

- Reproducibility Guide -- Reproducible research workflows
- Literature Review Agents -- Automated literature review
- Research Methodology -- FCC as research instrument
- Notebook `16_knowledge_graphs.ipynb` -- Knowledge graph deep dive