---
title: AGR-1.01 - Conformity Assessment Criteria (CAC)
sidebar_position: 3
---

# AGR-1.01 - Conformity Assessment Criteria (CAC) Guidelines

| Status | Created | Post-History |
|--------|-------------|---------------------------------|
| Active | 20-Jan-2025 | Initial creation |

## Why

Conformity Assessment Criteria (CAC) are essential for ensuring that participants and solutions in the Catena-X ecosystem meet the required standards and specifications. Properly defined and filled CACs ensure:

- **Consistency**: All participants are evaluated against the same criteria
- **Transparency**: Clear understanding of requirements for certification
- **Quality**: High standards across the ecosystem
- **Trust**: Confidence in certified solutions and participants

Without standardized CAC guidelines, the certification process would be inconsistent, leading to:
- Confusion among participants about requirements
- Inconsistent quality across certified solutions
- Potential security and compliance issues
- Reduced trust in the ecosystem

## Description

This guardrail defines how Conformity Assessment Criteria **must** be structured, documented, and maintained to ensure consistent and effective conformity assessment across the Catena-X ecosystem.

### Key Requirements

:::danger MUST
Every standard that requires conformity assessment **must** include clearly defined CACs that specify:
- What needs to be tested
- How it should be tested
- What constitutes a pass/fail
- Which tools or methods to use for verification
:::

:::warning SHOULD
CACs **should** be:
- Testable and verifiable
- Unambiguous and clear
- Aligned with the standard's normative sections
- Updatable as standards evolve
:::

## Scope

### In Scope
- CAC structure and formatting
- CAC content requirements
- CAC documentation and maintenance
- CAC verification methods

### Out of Scope
- Specific certification body procedures
- Pricing and commercial aspects of certification
- Legal liability and contracts

## Implementation Guidelines

### Step 1: Identify Conformity Requirements

Review the standard and identify all normative requirements that need verification:

- API endpoints and their behavior
- Data model compliance
- Security requirements
- Performance criteria
- Interoperability requirements

### Step 2: Define Test Criteria

For each requirement, define specific, measurable test criteria:

```markdown
## CAC: [Requirement Name]

### Criterion ID: CAC-[Standard]-[Number]

**Requirement**: [Reference to normative section]

**Test Description**: [How to test this requirement]

**Pass Criteria**:
- [Specific condition 1]
- [Specific condition 2]

**Fail Criteria**:
- [Specific condition that indicates failure]

**Verification Method**: [Manual/Automated/Tool-based]

**Tools Required**: [List of tools if applicable]
```
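To make the template concrete, the same fields can be captured as a machine-readable record. The following is a hypothetical sketch in Python — the class name, field names, and sample values are illustrative, not mandated by this guardrail:

```python
from dataclasses import dataclass, field

@dataclass
class CAC:
    """Hypothetical machine-readable form of the CAC template above."""
    criterion_id: str          # e.g. "CAC-0018-001"
    requirement: str           # reference to the normative section
    test_description: str
    pass_criteria: list[str]
    fail_criteria: list[str]
    verification_method: str   # "Manual", "Automated", or "Tool-based"
    tools_required: list[str] = field(default_factory=list)

# Example record mirroring the template fields
cac = CAC(
    criterion_id="CAC-0018-001",
    requirement="Section 3.1.2 of CX-0018 v3.0.0",
    test_description="Call the /health endpoint and verify the response.",
    pass_criteria=["HTTP 200", "JSON body with 'status' field"],
    fail_criteria=["4xx/5xx status", "invalid JSON"],
    verification_method="Automated",
    tools_required=["curl"],
)
print(cac.criterion_id)
```

Keeping CACs in a structured form like this makes it straightforward to generate both the markdown documentation and an automated test suite from one source.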

### Step 3: Document Evidence Requirements

Specify what evidence must be provided for each CAC:

- Screenshots
- Log files
- API responses
- Test results
- Configuration files
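Evidence should be reproducible and timestamped so a certification body can trace each result back to a specific check. A minimal sketch of capturing one evidence record as JSON — the record layout and file name are assumptions, not part of the guardrail:

```python
import json
import os
import tempfile
from datetime import datetime, timezone

def record_evidence(cac_id: str, api_response: dict, path: str) -> dict:
    """Write a timestamped evidence record for one CAC check.

    The record layout here is illustrative; a certification body may
    prescribe its own evidence format.
    """
    record = {
        "cac_id": cac_id,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "response": api_response,
    }
    with open(path, "w", encoding="utf-8") as f:
        json.dump(record, f, indent=2)
    return record

# Hypothetical usage: persist the response of a health-check call
path = os.path.join(tempfile.gettempdir(), "evidence.json")
rec = record_evidence("CAC-0018-001", {"status": "UP"}, path)
print(rec["cac_id"])
```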

### Step 4: Link to Standard Sections

Each CAC **must** reference the specific normative section(s) it verifies:

```markdown
**Normative Reference**: Section 2.3.1 of CX-XXXX v1.0.0
```

## Examples

### Example 1: API Endpoint CAC

:::tip Best Practice
This example shows a well-structured CAC for an API endpoint requirement.
:::

```markdown
## CAC: Data Provider API - Health Check Endpoint

### Criterion ID: CAC-0018-001

**Requirement**: Data providers must implement a health check endpoint as
specified in Section 3.1.2 of CX-0018 Dataspace Connectivity v3.0.0

**Test Description**:
Call the `/health` endpoint and verify it returns a valid response.

**Pass Criteria**:
- Endpoint responds with HTTP 200 status code
- Response body contains JSON with `status` field
- Status field value is either "UP" or "DOWN"
- Response time is less than 2 seconds

**Fail Criteria**:
- Endpoint returns 4xx or 5xx status code
- Response is not valid JSON
- Response does not contain required fields
- Response time exceeds 2 seconds

**Verification Method**: Automated API testing

**Tools Required**:
- REST client (e.g., Postman, curl)
- API testing framework

**Evidence Required**:
- Screenshot or log showing successful API call
- Response body with timestamp
```

**Why this is good**:
- Clear, specific pass/fail criteria
- References the normative section
- Specifies verification method and tools
- Defines required evidence
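Because the pass/fail criteria above are specific and measurable, they translate directly into code. A minimal sketch of an automated evaluator for this CAC, written as a pure function over captured values so it can be replayed from logs (the function name and thresholds mirror the example above):

```python
import json

def evaluate_health_check(status_code: int, body: str, elapsed_s: float) -> bool:
    """Apply the pass criteria of the health-check CAC to one observed response.

    Pure function over captured values (status code, raw body, elapsed
    seconds), so it can be re-run against logged evidence.
    """
    if status_code != 200:
        return False                      # fail: 4xx/5xx status code
    if elapsed_s >= 2.0:
        return False                      # fail: response time >= 2 seconds
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return False                      # fail: response is not valid JSON
    # pass: JSON body with a "status" field set to "UP" or "DOWN"
    return payload.get("status") in ("UP", "DOWN")

print(evaluate_health_check(200, '{"status": "UP"}', 0.4))    # True
print(evaluate_health_check(503, '{"status": "DOWN"}', 0.4))  # False (5xx)
```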

### Example 2: Data Model CAC

:::tip Best Practice
CAC for semantic model compliance.
:::

````markdown
## CAC: Aspect Model Validation

### Criterion ID: CAC-0003-001

**Requirement**: All aspect models must be valid SAMM 2.1.0 models as
specified in Section 2.1 of CX-0003 SAMM v1.1.0

**Test Description**:
Validate the aspect model file against SAMM 2.1.0 specification using
the SAMM CLI validator.

**Pass Criteria**:
- SAMM CLI validator reports no errors
- Model file is valid Turtle RDF syntax
- All required SAMM elements are present
- Model follows Catena-X naming conventions

**Fail Criteria**:
- SAMM CLI validator reports errors
- Invalid RDF/Turtle syntax
- Missing required SAMM elements

**Verification Method**: Automated validation using SAMM CLI

**Tools Required**:
- SAMM CLI (version 2.1.0 or higher)
- Turtle syntax validator

**Command**:
```bash
samm aspect validate -i model.ttl
```

**Evidence Required**:
- Aspect model file (.ttl)
- SAMM CLI validation output
- Screenshot showing successful validation
````

### Example 3: Anti-Pattern - Vague CAC

:::danger What Not To Do
This example shows a poorly defined CAC that is too vague.
:::

```markdown
## CAC: System Performance

**Requirement**: System must perform well

**Test**: Check if system is fast enough

**Pass**: System works fine
**Fail**: System is too slow
```

**Why this is wrong**:
- No specific performance metrics
- No reference to standard section
- No clear pass/fail criteria
- No verification method specified
- "Fast enough" and "fine" are subjective
- Missing evidence requirements

## Verification

### Checklist for CAC Authors

- [ ] Each CAC has a unique identifier (CAC-[Standard]-[Number])
- [ ] References specific normative section(s) of the standard
- [ ] Includes clear, measurable pass criteria
- [ ] Includes clear, measurable fail criteria
- [ ] Specifies verification method (manual/automated/tool-based)
- [ ] Lists required tools and versions
- [ ] Defines required evidence
- [ ] Uses unambiguous language
- [ ] Is testable by a certification body
- [ ] Aligns with the standard's scope and requirements

### Checklist for CAC Reviewers

- [ ] All normative requirements have corresponding CACs
- [ ] No CACs test non-normative content
- [ ] CACs are consistent with each other
- [ ] CACs can be executed by third parties
- [ ] Evidence requirements are reasonable
- [ ] Pass/fail criteria are unambiguous

## References

### Related Guardrails
- [AGR-4.01](./agr-4-01.md) - Standard Documentation Requirements

### Standards
- [CX-0003 SAMM Semantic Aspect Meta Model](../standards/CX-0003-SAMMSemanticAspectMetaModel/CX-0003-SAMMSemanticAspectMetaModel.md)
- [CX-0018 Dataspace Connectivity](../standards/CX-0018-DataspaceConnectivity/CX-0018-DataspaceConnectivity.md)

## FAQ

### Question 1: How many CACs should a standard have?

The number of CACs depends on the complexity of the standard. Each normative requirement that needs verification should have at least one CAC. A simple standard might have 5-10 CACs, while a complex standard could have 50+.

### Question 2: Can CACs be automated?

Yes, automation is encouraged where possible. Automated CACs provide:
- Faster verification
- More consistent results
- Reduced costs
- Better repeatability

However, some requirements may require manual verification (e.g., documentation quality, architectural decisions).
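For the automatable subset, a simple harness can run every check and aggregate the results. A hypothetical sketch — the function name and result format are illustrative, and note that a crashing check is deliberately recorded as a failure rather than an error:

```python
def run_cacs(checks: dict) -> dict:
    """Run a set of automated CAC checks and collect pass/fail results.

    `checks` maps a criterion ID to a zero-argument callable returning
    bool. Illustrative only; not a mandated harness.
    """
    results = {}
    for cac_id, check in checks.items():
        try:
            results[cac_id] = bool(check())
        except Exception:
            results[cac_id] = False  # a crashing check counts as a fail
    return results

# Hypothetical usage with two stand-in checks
results = run_cacs({
    "CAC-0018-001": lambda: True,
    "CAC-0003-001": lambda: 1 / 0,  # tool error -> recorded as fail
})
print(results)  # {'CAC-0018-001': True, 'CAC-0003-001': False}
```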

### Question 3: Who writes the CACs?

CACs are typically written by:
- Standard authors (during standard creation)
- Architecture review board members
- Certification experts
- Technical working group members

CACs should be reviewed and approved through the standard governance process.

### Question 4: When should CACs be updated?

CACs should be updated when:
- The standard is revised
- New verification tools become available
- Feedback from certification bodies identifies issues
- Ambiguities or errors are discovered

## Change History

| Version | Date | Changes | Author |
|---------|-------------|----------------------------------|----------------------|
| 1.0 | 20-Jan-2025 | Initial release | Architecture Team |

---

:::note Feedback
Questions or suggestions about CAC guidelines? Please reach out through the standard contribution channels or the architecture working group.
:::