K Number
K213353
Device Name
Aorta-CAD
Date Cleared
2022-09-20

(347 days)

Product Code
Regulation Number
892.2070
Panel
RA
Reference & Predicate Devices
Intended Use

Aorta-CAD is a computer-assisted detection (CADe) software device that analyzes chest radiograph studies for suspicious regions of interest (ROIs). The device uses a deep learning algorithm to identify ROIs and produces boxes around the ROIs. The boxes are labeled with one of the following radiographic findings: Aortic calcification or Dilated aorta.

Aorta-CAD is intended for use as a concurrent reading aid for physicians looking for ROIs with radiographic findings suggestive of Aortic Atherosclerosis or Aortic Ectasia. It does not replace the role of the physician or of other diagnostic testing in the standard of care. Aorta-CAD is indicated for adults only.

Device Description

Aorta-CAD is computer-assisted detection (CADe) software designed for physicians to increase the accurate detection of findings on chest radiographs that are suggestive of chronic conditions in the aorta. The ROIs are labeled with one of the following radiographic findings: Aortic calcification or Dilated aorta. Aorta-CAD is intended for use as a concurrent reading aid for physicians looking for suspicious ROIs with radiographic findings suggestive of Aortic Atherosclerosis or Aortic Ectasia. Aorta-CAD's output is available for physicians as a concurrent reading aid and does not replace the role of the physician or of other diagnostic testing in the standard of care for the distinct conditions. Aorta-CAD uses modern deep learning and computer vision techniques to analyze chest radiographs.

For each image within a study, Aorta-CAD generates a DICOM Presentation State file (output overlay). If any ROI is detected by Aorta-CAD in the study, the output overlay for each image includes which radiographic finding(s) were identified and what chronic condition in the aorta is suggested by these findings, such as "Aortic calcification suggestive of Aortic Atherosclerosis." In addition, if ROI(s) are detected in an image, bounding boxes surrounding each detected ROI are included in the output overlay for that image and are labeled with the radiographic findings, such as "Aortic calcification". If no ROI is detected by Aorta-CAD in the study, the output overlay for each image will include the text "No Aorta-CAD ROI(s)" and no bounding boxes will be included. Regardless of whether an ROI is detected, the overlay includes text identifying the X-ray study as analyzed by Aorta-CAD and a customer configurable message containing a link to our instructions for users to access labeling documents. The Aorta-CAD overlay can be toggled on or off by the physician within their Picture Archiving and Communication System (PACS) viewer, allowing for concurrent review of the X-ray study.
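The overlay logic described above (condition text and labeled bounding boxes when ROIs are detected, a "No Aorta-CAD ROI(s)" message otherwise) can be sketched as follows. This is an illustration only, not the vendor's implementation: the `ROI` class, the `FINDING_TO_CONDITION` mapping, and the "Analyzed by Aorta-CAD" wording are assumptions for the sketch; a real device would emit a DICOM Presentation State object rather than plain strings.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ROI:
    """A detected region of interest (hypothetical structure for this sketch)."""
    finding: str                        # "Aortic calcification" or "Dilated aorta"
    box: Tuple[int, int, int, int]      # (x_min, y_min, x_max, y_max) in pixels

# Assumed mapping from radiographic finding to the chronic condition it suggests,
# per the device description above.
FINDING_TO_CONDITION = {
    "Aortic calcification": "Aortic Atherosclerosis",
    "Dilated aorta": "Aortic Ectasia",
}

def build_overlay_text(rois: List[ROI]) -> List[str]:
    """Assemble the text lines of the output overlay for one image."""
    # Every overlay identifies the study as analyzed by the device.
    lines = ["Analyzed by Aorta-CAD"]
    if not rois:
        # No detections anywhere in the image: fixed negative message, no boxes.
        lines.append("No Aorta-CAD ROI(s)")
        return lines
    # One condition line per distinct finding; boxes themselves would carry
    # the finding label (e.g. "Aortic calcification") next to each ROI.
    for finding in sorted({r.finding for r in rois}):
        lines.append(f"{finding} suggestive of {FINDING_TO_CONDITION[finding]}")
    return lines
```

In a PACS viewer, an overlay like this would be toggled on or off as a presentation layer over the unmodified radiograph, which is what allows concurrent review.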

AI/ML Overview

Here's a breakdown of the acceptance criteria and the study proving the device meets them, based on the provided text:

1. Table of Acceptance Criteria and Reported Device Performance

The document doesn't explicitly list "acceptance criteria" in a separate section with pass/fail thresholds. Instead, it describes "performance assessment" and "clinical study" results that implicitly demonstrate acceptable performance and substantial equivalence. The key performance metrics from the standalone and clinical studies are presented below.

| Metric (Implicit Acceptance Criteria) | Reported Device Performance (Standalone Study) | Reported Device Performance (Clinical Study - Aided vs. Unaided) |
|---|---|---|
| **Overall Standalone Performance** | | |
| Sensitivity | 0.910 (95% CI: 0.896, 0.922) | Not directly comparable (MRMC study focuses on reader improvement; 0.910 refers to algorithm only) |
| Specificity | 0.896 (95% CI: 0.889, 0.902) | Not directly comparable (MRMC study focuses on reader improvement; 0.896 refers to algorithm only) |
| AUC (Overall) | 0.974 (95% Bootstrap CI: 0.971, 0.977) | Not directly comparable (MRMC study focuses on reader improvement; 0.974 refers to algorithm only) |
| **Category-Specific Standalone Performance** | | |
| Aortic calcification suggestive of Aortic Atherosclerosis (AUC) | 0.972 (95% Bootstrap CI: 0.967, 0.976) | Reader AUC estimates significantly improved (p |
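The standalone metrics above use standard definitions, and the quoted confidence intervals are of the kind produced by a percentile bootstrap (resample cases with replacement, recompute the metric, take the empirical quantiles). As a generic illustration only, not the sponsor's actual analysis code:

```python
import random

def sensitivity(y_true, y_pred):
    """TP / (TP + FN) over binary labels and binary predictions."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fn)

def specificity(y_true, y_pred):
    """TN / (TN + FP) over binary labels and binary predictions."""
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tn / (tn + fp)

def bootstrap_ci(metric, y_true, y_pred, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap CI: resample cases with replacement and take
    the empirical alpha/2 and 1 - alpha/2 quantiles of the metric."""
    rng = random.Random(seed)
    n = len(y_true)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(metric([y_true[i] for i in idx], [y_pred[i] for i in idx]))
    stats.sort()
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

For a case-level bootstrap like the one implied by the "95% Bootstrap CI" rows, each resampled index corresponds to a whole case (study), so correlated findings within a case stay together.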

§ 892.2070 Medical image analyzer.

(a) Identification. Medical image analyzers, including computer-assisted/aided detection (CADe) devices for mammography breast cancer, ultrasound breast lesions, radiograph lung nodules, and radiograph dental caries detection, is a prescription device that is intended to identify, mark, highlight, or in any other manner direct the clinicians' attention to portions of a radiology image that may reveal abnormalities during interpretation of patient radiology images by the clinicians. This device incorporates pattern recognition and data analysis capabilities and operates on previously acquired medical images. This device is not intended to replace the review by a qualified radiologist, and is not intended to be used for triage, or to recommend diagnosis.

(b) Classification. Class II (special controls). The special controls for this device are:

(1) Design verification and validation must include:
(i) A detailed description of the image analysis algorithms including a description of the algorithm inputs and outputs, each major component or block, and algorithm limitations.
(ii) A detailed description of pre-specified performance testing methods and dataset(s) used to assess whether the device will improve reader performance as intended and to characterize the standalone device performance. Performance testing includes one or more standalone tests, side-by-side comparisons, or a reader study, as applicable.
(iii) Results from performance testing that demonstrate that the device improves reader performance in the intended use population when used in accordance with the instructions for use. The performance assessment must be based on appropriate diagnostic accuracy measures (e.g., receiver operator characteristic plot, sensitivity, specificity, predictive value, and diagnostic likelihood ratio). The test dataset must contain a sufficient number of cases from important cohorts (e.g., subsets defined by clinically relevant confounders, effect modifiers, concomitant diseases, and subsets defined by image acquisition characteristics) such that the performance estimates and confidence intervals of the device for these individual subsets can be characterized for the intended use population and imaging equipment.

(iv) Appropriate software documentation (e.g., device hazard analysis; software requirements specification document; software design specification document; traceability analysis; description of verification and validation activities including system level test protocol, pass/fail criteria, and results; and cybersecurity).

(2) Labeling must include the following:
(i) A detailed description of the patient population for which the device is indicated for use.
(ii) A detailed description of the intended reading protocol.
(iii) A detailed description of the intended user and user training that addresses appropriate reading protocols for the device.
(iv) A detailed description of the device inputs and outputs.
(v) A detailed description of compatible imaging hardware and imaging protocols.
(vi) Discussion of warnings, precautions, and limitations must include situations in which the device may fail or may not operate at its expected performance level (e.g., poor image quality or for certain subpopulations), as applicable.

(vii) Device operating instructions.
(viii) A detailed summary of the performance testing, including: test methods, dataset characteristics, results, and a summary of sub-analyses on case distributions stratified by relevant confounders, such as lesion and organ characteristics, disease stages, and imaging equipment.