K Number: K222746
Manufacturer:
Date Cleared: 2023-03-27 (196 days)
Product Code:
Regulation Number: 892.2070
Panel: RA
Reference & Predicate Devices:
Intended Use

Overjet Caries Assist (OCA) is a radiological, automated, concurrent-read, computer-assisted detection (CADe) software intended to aid in the detection and segmentation of caries on bitewing and periapical radiographs. The device provides additional information for the dentist to use in their diagnosis of a tooth surface suspected of being carious. The device is not intended as a replacement for a complete dentist's review or their clinical judgment that takes into account other relevant information from the image, patient history, or actual in vivo clinical assessment.

Device Description

Overjet Caries Assist (OCA) is a radiological, automated, concurrent-read, computer-assisted detection (CADe) software intended to aid in the detection and segmentation of caries on bitewing and periapical radiographs. The device provides additional information for the dentist to use in their diagnosis of a tooth surface suspected of being carious. The device is not intended as a replacement for a complete dentist's review or their clinical judgment that takes into account other relevant information from the image, patient history, or actual in vivo clinical assessment.

OCA is a software-only device that operates in three layers: a Network Layer, a Presentation Layer, and a Decision Layer. Images are pulled in from a clinic/dental office, the machine learning model creates predictions in the Decision Layer, and results are pushed to the dashboard in the Presentation Layer.

The machine learning system within the Decision Layer processes bitewing and periapical radiographs and annotates suspected carious lesions. It comprises four modules (a minimal illustrative sketch of this flow follows the list):

  • Image Preprocessor Module
  • Tooth Number Assignment Module
  • Caries Module
  • Post Processing
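
The submission does not include source code, so the following is a minimal, purely illustrative Python sketch of how the documented flow (images pulled in, the four Decision Layer modules applied in sequence, results pushed to the Presentation Layer dashboard) could be organized. All type, field, and function names here are hypothetical placeholders, not Overjet's actual implementation.

```python
from dataclasses import dataclass, field
from typing import Any, List

# Hypothetical types; names are illustrative, not taken from the 510(k).
@dataclass
class CariesFinding:
    tooth_number: int   # assigned by the Tooth Number Assignment Module
    surface: str        # e.g. "mesial-occlusal"
    mask: Any           # pixel-level segmentation of the suspected lesion

@dataclass
class CaseResult:
    image_id: str
    findings: List[CariesFinding] = field(default_factory=list)

# --- Decision Layer modules (placeholders standing in for the ML components) ---
def preprocess_image(raw_image: Any) -> Any:
    """Image Preprocessor Module: normalize the bitewing/periapical radiograph."""
    return raw_image

def assign_tooth_numbers(image: Any) -> List[int]:
    """Tooth Number Assignment Module: localize and number teeth in the image."""
    return []

def detect_caries(image: Any, teeth: List[int]) -> List[CariesFinding]:
    """Caries Module: model inference producing candidate lesion segmentations."""
    return []

def postprocess(findings: List[CariesFinding]) -> List[CariesFinding]:
    """Post Processing: filter/merge raw detections into reportable findings."""
    return findings

def run_decision_layer(image_id: str, raw_image: Any) -> CaseResult:
    """Chain the four modules for one image pulled in from the clinic."""
    image = preprocess_image(raw_image)
    teeth = assign_tooth_numbers(image)
    return CaseResult(image_id, postprocess(detect_caries(image, teeth)))

def push_to_dashboard(result: CaseResult) -> None:
    """Presentation Layer: surface the annotated findings on the dentist dashboard."""
    print(f"{result.image_id}: {len(result.findings)} suspected carious surface(s)")
```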
AI/ML Overview

This document describes the Overjet Caries Assist (OCA) device, a computer-assisted detection (CADe) software intended to aid dentists in the detection and segmentation of caries on bitewing and periapical radiographs.

Here's an analysis of the acceptance criteria and the study that proves the device meets them:

1. Table of Acceptance Criteria and Reported Device Performance:

The document does not explicitly state numerical acceptance criteria for sensitivity or specificity as pre-defined goals. It does, however, state one performance objective for the clinical reader improvement study: an "Increase in dentist's sensitivity of greater than 15%". The remaining metrics are presented as reported performance from the standalone and clinical evaluation studies.

| Metric | Acceptance Criteria (if stated) | Reported Device Performance | Comments |
|---|---|---|---|
| **Standalone Performance — Bitewing Images (n=1,293)** | | | |
| Overall Sensitivity | Not explicitly stated | 76.6% (73.8%, 79.4%) | Based on surfaces (n=27,920) |
| Primary Caries Sensitivity | Not explicitly stated | 79.9% (77.1%, 82.7%) | |
| Secondary Caries Sensitivity | Not explicitly stated | 60.9% (53.5%, 68.2%) | |
| Enamel Caries Sensitivity | Not explicitly stated | 74.4% (70.4%, 78.3%) | |
| Dentin Caries Sensitivity | Not explicitly stated | 79.5% (75.8%, 83.2%) | |
| Overall Specificity | Not explicitly stated | 99.1% (98.9%, 99.2%) | |
| Primary Caries Dice Score | Not explicitly stated | 0.77 (0.76, 0.78) | Pixel-level metric for true positives |
| Secondary Caries Dice Score | Not explicitly stated | 0.73 (0.70, 0.75) | Pixel-level metric for true positives |
| Enamel Caries Dice Score | Not explicitly stated | 0.76 (0.75, 0.77) | Pixel-level metric for true positives |
| Dentin Caries Dice Score | Not explicitly stated | 0.77 (0.76, 0.79) | Pixel-level metric for true positives |
| **Standalone Performance — Periapical Images (n=1,314)** | | | |
| Overall Sensitivity | Not explicitly stated | 79.4% (76.1%, 82.8%) | Based on surfaces (n=16,254) |
| Primary Caries Sensitivity | Not explicitly stated | 79.8% (76.0%, 83.7%) | |
| Secondary Caries Sensitivity | Not explicitly stated | 77.9% (71.4%, 84.5%) | |
| Enamel Caries Sensitivity | Not explicitly stated | 67.9% (60.7%, 75.1%) | |
| Dentin Caries Sensitivity | Not explicitly stated | 84.9% (81.3%, 88.4%) | |
| Overall Specificity | Not explicitly stated | 99.4% (99.2%, 99.5%) | |
| Primary Caries Dice Score | Not explicitly stated | 0.79 (0.78, 0.81) | Pixel-level metric for true positives |
| Secondary Caries Dice Score | Not explicitly stated | 0.79 (0.77, 0.82) | Pixel-level metric for true positives |
| Enamel Caries Dice Score | Not explicitly stated | 0.75 (0.73, 0.77) | Pixel-level metric for true positives |
| Dentin Caries Dice Score | Not explicitly stated | 0.81 (0.80, 0.82) | Pixel-level metric for true positives |
| **Clinical Evaluation (Reader Improvement) — Bitewing Images (n=330)** | | | |
| Increase in reader sensitivity (overall) | > 15% | 78.5% (72.6%, 83.6%) assisted vs. 64.6% (56.4%, 72.1%) unassisted | Absolute increase = 13.9 percentage points, slightly below the stated >15% criterion; as a relative increase, ((78.5 / 64.6) − 1) × 100 = 21.5%, which meets it. The document's framing implies the relative interpretation and concludes the device demonstrates a "clear benefit". |
| Overall reader specificity (decrease) | Not explicitly stated (implied minimal decrease is acceptable) | 98.6% (assisted) vs. 99.0% (unassisted) | Decrease of 0.4 percentage points |
| Overall wAFROC AUC (increase) | Not explicitly stated | 0.785 (assisted) vs. 0.729 (unassisted) | Increase of 0.055, statistically significant |
| **Clinical Evaluation (Reader Improvement) — Periapical Images** | | | |
| Increase in reader sensitivity (overall) | > 15% | — | — |
| Overall reader specificity (decrease) | Not explicitly stated (implied minimal decrease is acceptable) | 97.6% (assisted) vs. 98.0% (unassisted) | Decrease of 0.4 percentage points |
| Overall wAFROC AUC (increase) | Not explicitly stated | 0.848 (assisted) vs. 0.799 (unassisted) | Increase of 0.050, statistically significant |
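
For readers unfamiliar with the metrics above, the sketch below shows in plain Python how a pixel-level Dice score is conventionally computed for a true-positive detection, and how the absolute and relative readings of the ">15% increase in sensitivity" criterion differ when applied to the reported bitewing reader-study values. This is a generic illustration of the standard formulas, not Overjet's evaluation code.

```python
import numpy as np

def dice_score(pred_mask: np.ndarray, truth_mask: np.ndarray) -> float:
    """Dice coefficient between predicted and reference binary masks:
    2*|A ∩ B| / (|A| + |B|), the pixel-level agreement metric reported
    for true-positive caries detections."""
    pred = pred_mask.astype(bool)
    truth = truth_mask.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, truth).sum() / denom

# Toy 2D masks standing in for a caries segmentation on one tooth surface.
truth = np.zeros((8, 8), dtype=bool); truth[2:6, 2:6] = True
pred = np.zeros((8, 8), dtype=bool); pred[3:7, 2:6] = True
print(f"Dice = {dice_score(pred, truth):.2f}")  # 0.75 for this toy overlap

# Two readings of the ">15% increase in sensitivity" criterion, using the
# reported bitewing reader-study sensitivities (unassisted vs. assisted).
unassisted, assisted = 64.6, 78.5
absolute_increase = assisted - unassisted                  # 13.9 percentage points
relative_increase = (assisted / unassisted - 1.0) * 100.0  # ~21.5 %
print(f"absolute: {absolute_increase:.1f} points, relative: {relative_increase:.1f}%")
```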

§ 892.2070 Medical image analyzer.

(a) Identification. Medical image analyzers, including computer-assisted/aided detection (CADe) devices for mammography breast cancer, ultrasound breast lesions, radiograph lung nodules, and radiograph dental caries detection, is a prescription device that is intended to identify, mark, highlight, or in any other manner direct the clinicians' attention to portions of a radiology image that may reveal abnormalities during interpretation of patient radiology images by the clinicians. This device incorporates pattern recognition and data analysis capabilities and operates on previously acquired medical images. This device is not intended to replace the review by a qualified radiologist, and is not intended to be used for triage, or to recommend diagnosis.

(b) Classification. Class II (special controls). The special controls for this device are:

(1) Design verification and validation must include:

(i) A detailed description of the image analysis algorithms including a description of the algorithm inputs and outputs, each major component or block, and algorithm limitations.

(ii) A detailed description of pre-specified performance testing methods and dataset(s) used to assess whether the device will improve reader performance as intended and to characterize the standalone device performance. Performance testing includes one or more standalone tests, side-by-side comparisons, or a reader study, as applicable.

(iii) Results from performance testing that demonstrate that the device improves reader performance in the intended use population when used in accordance with the instructions for use. The performance assessment must be based on appropriate diagnostic accuracy measures (e.g., receiver operator characteristic plot, sensitivity, specificity, predictive value, and diagnostic likelihood ratio). The test dataset must contain a sufficient number of cases from important cohorts (e.g., subsets defined by clinically relevant confounders, effect modifiers, concomitant diseases, and subsets defined by image acquisition characteristics) such that the performance estimates and confidence intervals of the device for these individual subsets can be characterized for the intended use population and imaging equipment.

(iv) Appropriate software documentation (e.g., device hazard analysis; software requirements specification document; software design specification document; traceability analysis; description of verification and validation activities including system level test protocol, pass/fail criteria, and results; and cybersecurity).

(2) Labeling must include the following:
(i) A detailed description of the patient population for which the device is indicated for use.
(ii) A detailed description of the intended reading protocol.
(iii) A detailed description of the intended user and user training that addresses appropriate reading protocols for the device.
(iv) A detailed description of the device inputs and outputs.
(v) A detailed description of compatible imaging hardware and imaging protocols.
(vi) Discussion of warnings, precautions, and limitations must include situations in which the device may fail or may not operate at its expected performance level (e.g., poor image quality or for certain subpopulations), as applicable.

(vii) Device operating instructions.
(viii) A detailed summary of the performance testing, including: test methods, dataset characteristics, results, and a summary of sub-analyses on case distributions stratified by relevant confounders, such as lesion and organ characteristics, disease stages, and imaging equipment.