CAPA Example for Medical Device Software (SaMD)
If you've ever tried to write a CAPA for software as a medical device, you know the hardest part isn't documenting the issue. It's structuring the investigation in a way that actually holds up under audit.
Below is a real example of a CAPA investigation for a SaMD nonconformance involving algorithm performance and patient risk classification.
This example shows how root cause analysis, corrective actions, and effectiveness checks come together in a structured, audit-ready format.
Nonconformance Description
During pre-release system-level verification of a SaMD application (v2.3.1), the risk stratification algorithm reproducibly under-classified high-risk patients as moderate risk for cases involving BMI > 40 and comorbidity count >= 3.
The issue was observed in 5 of 7 test datasets representing high-risk populations. The defect was detected prior to release and contained within the test environment.
Potential impact includes delayed clinical intervention or inappropriate treatment planning if the issue were to reach production.
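A defect like this is typically reproduced as a subgroup-specific verification test. The sketch below is illustrative only: `classify_risk` is a hypothetical stand-in for the real stratification algorithm (its actual interface is not described in this example), and the edge-case inputs are invented to match the reported conditions (BMI > 40, comorbidity count >= 3).

```python
# Sketch of a subgroup-specific verification check for the reported
# edge case: BMI > 40 with >= 3 comorbidities must classify as "high".
# `classify_risk` is a hypothetical placeholder, not the real model.

def classify_risk(bmi: float, comorbidity_count: int) -> str:
    """Placeholder: a correct implementation classifies BMI > 40
    with >= 3 comorbidities as high risk."""
    if bmi > 40 and comorbidity_count >= 3:
        return "high"
    if bmi > 30 or comorbidity_count >= 2:
        return "moderate"
    return "low"

def test_high_risk_edge_case():
    # Representative inputs from the affected subpopulation (illustrative)
    edge_cases = [(41.2, 3), (45.0, 4), (52.8, 5)]
    for bmi, count in edge_cases:
        result = classify_risk(bmi, count)
        assert result == "high", (
            f"Under-classified: BMI={bmi}, comorbidities={count} -> {result}"
        )

test_high_risk_edge_case()
```

In the nonconformance above, a test of this shape would have failed for 5 of the 7 high-risk datasets, which is exactly the kind of subgroup coverage the verification-gap finding calls for.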
Problem Statement
The software failed to correctly classify high-risk patients under defined edge-case conditions, for reasons not yet determined, representing a major software nonconformance with potential patient safety implications.
Root Cause Analysis (Summary)
A structured Ishikawa and 5-Why analysis identified multiple contributing factors across system design, data, and verification processes.
Key contributing areas:
- Data limitations - underrepresentation of extreme BMI populations in training datasets
- Preprocessing behavior - potential clipping or transformation of extreme input values
- Requirements gaps - lack of defined performance criteria for high-risk subpopulations
- Verification gaps - absence of subgroup-specific test coverage
Leading hypothesis:
The most probable root cause is insufficient representation of high-risk patient subpopulations in the training and validation datasets, resulting in poor model generalization under these conditions.
However, alternative contributing causes remain plausible and require verification.
Corrective Actions (Overview)
Corrective actions were structured across four layers:
1. Containment
- Release hold placed on affected software version
- Stakeholders notified across QA, RA, and development
2. Immediate Correction
- Reproduction with full instrumentation of preprocessing and inference pipeline
- Data audit to quantify representation gaps
3. Systemic Corrective Actions
- Data acquisition and augmentation targeting high-risk populations
- Model retraining with stratified validation
- Updates to design inputs and traceability within the Design History File (DHF)
- Implementation of subgroup-specific performance requirements
4. Preventive Actions
- Portfolio-wide review of similar models
- Updates to SOPs governing model validation and data representation
- Training across engineering, QA, and clinical teams
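The data audit in the immediate-correction layer can be quantified with a simple subgroup representation check. The sketch below uses only the standard library; the field names and the 1% flagging threshold are illustrative assumptions, not values from the investigation.

```python
# Quantify how well the high-risk subgroup (BMI > 40, >= 3 comorbidities)
# is represented in a dataset. Field names and the minimum-fraction
# threshold are illustrative assumptions.

def subgroup_representation(records, min_fraction=0.01):
    """Return the subgroup fraction and whether it falls below
    the minimum acceptable representation threshold."""
    in_subgroup = sum(
        1 for r in records
        if r["bmi"] > 40 and r["comorbidities"] >= 3
    )
    fraction = in_subgroup / len(records) if records else 0.0
    return fraction, fraction < min_fraction

# Example: 2 of 1000 records fall in the high-risk subgroup (0.2%)
records = ([{"bmi": 28, "comorbidities": 1}] * 998
           + [{"bmi": 44, "comorbidities": 3}] * 2)
fraction, under_represented = subgroup_representation(records)
print(f"subgroup fraction: {fraction:.3f}, flagged: {under_represented}")
```

A check like this makes the "representation gap" finding auditable: the dataset either meets the defined subgroup floor or it does not.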
Effectiveness Checks
Effectiveness was defined using measurable and auditable criteria:
- Minimum subgroup sensitivity thresholds (for example, >=90%)
- Validation across multiple independent datasets
- Monitoring period with defined pass/fail conditions
- CI/CD gating to prevent regression
These criteria ensure the issue is not only corrected but also prevented from recurring.
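The sensitivity-threshold and CI/CD-gating criteria can be expressed as a single automated check: compute per-subgroup sensitivity and fail the pipeline if any subgroup falls below the floor. A minimal sketch, assuming binary "high"/"not high" labels; the 90% threshold mirrors the example criterion above, and the subgroup names are invented for illustration.

```python
def subgroup_sensitivity(y_true, y_pred, positive="high"):
    """Sensitivity (recall) for the positive class:
    true positives / actual positives."""
    actual = [i for i, y in enumerate(y_true) if y == positive]
    if not actual:
        raise ValueError("no positive cases in this subgroup")
    hits = sum(1 for i in actual if y_pred[i] == positive)
    return hits / len(actual)

def gate(results_by_subgroup, threshold=0.90):
    """CI/CD-style gate: every subgroup must meet the sensitivity
    floor. Returns the failing subgroups; empty dict means pass."""
    failures = {}
    for name, (y_true, y_pred) in results_by_subgroup.items():
        s = subgroup_sensitivity(y_true, y_pred)
        if s < threshold:
            failures[name] = s
    return failures

# Example: the BMI>40 subgroup misses 2 of 10 high-risk cases (80%)
results = {
    "bmi_gt_40": (["high"] * 10, ["high"] * 8 + ["moderate"] * 2),
    "general":   (["high"] * 10, ["high"] * 10),
}
print(gate(results))  # {'bmi_gt_40': 0.8} -> gate fails, release blocked
```

Wiring a check like this into the build pipeline is what turns "monitoring period with pass/fail conditions" from a statement of intent into an enforced control.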
Why This Matters
One of the most common CAPA failures is stopping at surface-level explanations like “model error” or “data issue.”
In this case, the deeper issue was systemic:
- lack of requirements for edge-case performance
- insufficient risk identification
- gaps in verification strategy
This is what auditors expect to see. Not just what failed, but why the system allowed it to fail.
See How This Was Generated
This CAPA example was generated using CAPA Engine, a structured investigation tool designed for regulated industries including medical devices, pharma, aerospace, and manufacturing.
It helps quality teams move from unstructured nonconformance descriptions to complete CAPA investigations with:
- root cause analysis
- corrective action plans
- effectiveness checks
- investigation reasoning and confidence
Frequently Asked Questions
What is a CAPA example?
A CAPA example shows how a nonconformance is investigated, including root cause analysis, corrective actions, and effectiveness checks.
What should be included in a CAPA?
A CAPA should include a clear problem statement, root cause analysis, corrective actions, and measurable effectiveness checks.
Related CAPA Examples
Curious how this compares to using ChatGPT for CAPA? Read our breakdown of where generic AI helps, where it falls short, and what a more structured CAPA approach looks like.
Try it yourself
This example was generated using CAPA Engine.
Paste your own nonconformance and see the full investigation.
Move from an unstructured event description to a structured CAPA investigation with root cause analysis, corrective actions, and effectiveness checks.
Try CAPA Engine - Free Analysis